CN113190047B - Unmanned aerial vehicle group path recognition method based on two-dimensional plane - Google Patents

Unmanned aerial vehicle group path recognition method based on two-dimensional plane

Info

Publication number
CN113190047B
CN113190047B (application CN202110595220.3A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
aerial vehicles
height
vehicles
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110595220.3A
Other languages
Chinese (zh)
Other versions
CN113190047A (en)
Inventor
鲁仁全
雷群楼
陶杰
翁剑鸿
林明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN202110595220.3A priority Critical patent/CN113190047B/en
Publication of CN113190047A publication Critical patent/CN113190047A/en
Application granted granted Critical
Publication of CN113190047B publication Critical patent/CN113190047B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D 1/104 Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application discloses a two-dimensional plane-based unmanned aerial vehicle group path identification method, which comprises the following steps: detecting GPS signals in real time: the unmanned aerial vehicle group detects the GPS signal condition in real time; when the GPS signal is detected to be smaller than a set value, the group stops advancing and switches to an autonomous navigation mode; determining a temporary director: the n unmanned aerial vehicles are numbered and sequenced, each unmanned aerial vehicle sends a signal to all the other unmanned aerial vehicles to request their space coordinates, every unmanned aerial vehicle that receives the request returns its space coordinate, and the unmanned aerial vehicle that receives n-1 space coordinates serves as the temporary director; the height of the unmanned aerial vehicle group is then adjusted, the central unmanned aerial vehicle is determined, ground images are collected, the road is identified, the path is planned, and the unmanned aerial vehicle group is moved. The application aims to provide a two-dimensional plane-based unmanned aerial vehicle group path recognition method that enables navigation and path planning for unmanned aerial vehicle groups that need to fly cooperatively in a complex low-altitude environment.

Description

Unmanned aerial vehicle group path recognition method based on two-dimensional plane
Technical Field
The application relates to the field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle group path identification method based on a two-dimensional plane.
Background
An unmanned aerial vehicle is an aircraft that carries no pilot on board; compared with a manned aircraft, it has the advantages of small size, low cost, convenient use and the like. In addition to military applications, unmanned aerial vehicles are widely used in civil applications such as aerial photography, agriculture, plant protection, power line inspection, surveying and mapping.
An unmanned aerial vehicle group has the remarkable advantages of strong load capacity, strong maneuverability and high space utilization. This gives the group a great advantage in completing tasks cooperatively, but in a complex urban environment cooperative flight remains a significant challenge. One option is to use GPS positioning and navigate within a known map; another is to build a map of the unknown environment in real time by visual means and navigate on that basis.
The existing solution of GPS positioning and navigation in a known map has two preconditions: first, a good GPS signal must be maintained without interruption; second, the unmanned aerial vehicle must store the map in advance. However, some places have no map, so a map cannot be obtained in advance, and the presence of a GPS signal cannot be guaranteed. The alternative of mapping the unknown environment in real time by visual means requires a large amount of computation and memory and therefore demanding hardware; the computing performance of existing embedded devices is limited, so real-time mapping can hardly meet the real-time requirements of a cooperative task.
Disclosure of Invention
The application aims to provide a two-dimensional plane-based unmanned aerial vehicle group path recognition method that enables navigation and path planning for unmanned aerial vehicle groups that need to fly cooperatively in a complex low-altitude environment.
To achieve the purpose, the application adopts the following technical scheme: a two-dimensional plane-based unmanned aerial vehicle group path identification method comprises the following steps:
detecting GPS signals in real time: the unmanned aerial vehicle group detects the GPS signal condition in real time; when the GPS signal is detected to be smaller than a set value, the unmanned aerial vehicle group stops advancing and switches from the GPS navigation mode to an autonomous navigation mode;
determining a temporary director: the n unmanned aerial vehicles are numbered and sequenced; each unmanned aerial vehicle sends a signal to all the other unmanned aerial vehicles to request their space coordinates, every unmanned aerial vehicle that receives the request returns its space coordinate, and the unmanned aerial vehicle that receives n-1 space coordinates serves as the temporary director;
adjusting the height of the unmanned aerial vehicle group: taking the height of the temporary director as a reference, all the other unmanned aerial vehicles are controlled to adjust their height so that it is kept consistent with the height of the temporary director;
determining the central unmanned aerial vehicle: the sum of the Euclidean distances between each unmanned aerial vehicle and the remaining unmanned aerial vehicles is calculated, the n resulting distance sums are compared, and the unmanned aerial vehicle with the smallest sum is taken as the central unmanned aerial vehicle;
collecting ground images: the central unmanned aerial vehicle is raised at a constant speed, and the camera carried by the central unmanned aerial vehicle photographs the ground during the ascent;
identifying the road: after the central unmanned aerial vehicle has risen to a set height, the whole area is photographed, and the image of the whole area is processed to identify the road;
path planning: the image of the whole area is gridded, and a path is planned for the position of the whole unmanned aerial vehicle group;
moving the unmanned aerial vehicle group: the moving distance of the unmanned aerial vehicle group is obtained from the relation between image distance and actual distance, and the unmanned aerial vehicle group is controlled to move.
Preferably, in the step of detecting the GPS signal in real time: when the GPS signal is detected to be smaller than the set value, the GPS signal currently acquired by the unmanned aerial vehicle group and the space coordinate acquired by the sensor of the unmanned aerial vehicle are recorded.
Preferably, in the step of determining the temporary director: if one unmanned aerial vehicle simultaneously receives requests from a plurality of unmanned aerial vehicles to serve as the temporary director, the unmanned aerial vehicle with the smallest number among those unmanned aerial vehicles is taken as the temporary director, and the number of the temporary director is broadcast to all unmanned aerial vehicles.
Preferably, the step of adjusting the height of the unmanned aerial vehicle group further includes coordinate anti-collision detection: the overall length of an unmanned aerial vehicle is set to y and its width to x, the space coordinate of any unmanned aerial vehicle is (x_a, y_a, z_a), and the space coordinate of another, adjacent unmanned aerial vehicle is (x_b, y_b, z_b); if the space coordinates of the two unmanned aerial vehicles satisfy |x_a - x_b| < 4x, |x_a - x_b| < 4y, |y_a - y_b| < 4x and |y_a - y_b| < 4y, it is judged that the two unmanned aerial vehicles will collide, and the temporary director controls the corresponding unmanned aerial vehicle to move laterally toward the area with fewer unmanned aerial vehicles until the space coordinates of all unmanned aerial vehicles no longer satisfy the collision condition; the temporary director then controls the remaining unmanned aerial vehicles to ascend or descend to the same height as the temporary director.
Preferably, in the step of determining the central unmanned aerial vehicle: let the space coordinates of the n unmanned aerial vehicles be (x_1, y_1, z_1), (x_2, y_2, z_2), …, (x_n, y_n, z_n), where z_1 = z_2 = … = z_n; the sum of the distances between the first unmanned aerial vehicle and the remaining n-1 unmanned aerial vehicles is $E_1=\sum_{j=2}^{n}\sqrt{(x_1-x_j)^2+(y_1-y_j)^2}$; in the same way, the distance sum E_2 from the second unmanned aerial vehicle to the remaining n-1 unmanned aerial vehicles, …, and the distance sum E_n from the n-th unmanned aerial vehicle to the remaining n-1 unmanned aerial vehicles are obtained; the minimum value among E_1 to E_n is E_i, and the i-th unmanned aerial vehicle is selected as the central unmanned aerial vehicle.
Preferably, in the step of acquiring the ground image: the central unmanned aerial vehicle is raised at a constant speed and stops after rising by the set height, and the central unmanned aerial vehicle determines the space coordinate positions of the remaining unmanned aerial vehicles by identifying their elliptical protection rings:
during the ascent of the central unmanned aerial vehicle, its camera continuously acquires multiple frames of pictures; the position range of each remaining unmanned aerial vehicle in the current frame is determined from its initial position in the previous frame, the area within the corresponding range is magnified and searched, the coordinate value of the unmanned aerial vehicle in the magnified image is identified, and this coordinate value is transformed back into the original, unmagnified picture, so that the coordinates of the current positions of the remaining unmanned aerial vehicles are obtained.
Preferably, in the step of identifying the road: graying, Gaussian blur, Canny edge detection, irregular ROI region interception, Hough straight-line detection and lane calculation are carried out on the image of the whole region, so that the road is identified.
Preferably, in the step of path planning: after the road is identified, path planning is carried out for the position of the whole unmanned aerial vehicle group through a dynamic A-star algorithm.
Preferably, the step of moving the unmanned aerial vehicle group includes: solving the scale r_x between the pixel distance in the picture photographed at the current height and the actual distance on the ground, the formula of the scale r_x being $r_x=\frac{w}{L_x}=\frac{f_x}{H}$, wherein r_x denotes the scale in the x-axis direction, w denotes the width of the image in pixels in the x-axis direction, f_x denotes the focal length of the camera, H denotes the height of the camera above the ground, and L_x denotes the actual distance, i.e. the actual distance in the x-axis direction;
solving the scale r_y between the pixel distance in the picture photographed at the current height and the actual distance on the ground, the formula of the scale r_y being $r_y=\frac{h}{L_y}=\frac{f_x}{H}$, wherein h denotes the width of the image in pixels in the y-axis direction, f_x denotes the focal length of the camera, H denotes the height of the camera above the ground, and L_y denotes the actual distance, i.e. the actual distance in the y-axis direction;
for a pixel point P(x_1, y_1), the length of the pixel point P in the x-axis direction of the image is x_1, which is substituted into the formula $L_x=\frac{x_1}{r_x}=\frac{x_1 H}{f_x}$ to obtain the actual distance L_x in the x-axis direction; the length of the pixel point P in the y-axis direction of the image is y_1, which is substituted into the formula $L_y=\frac{y_1}{r_y}=\frac{y_1 H}{f_x}$ to obtain the actual distance L_y in the y-axis direction;
letting the point reached by the movement be P_1, the actual moving distance is $L=\sqrt{L_x^2+L_y^2}$.
By adopting the method, when there is no GPS signal or the unmanned aerial vehicle group moves into an area with a poor GPS signal, the group can switch to the autonomous navigation mode without completing real-time path planning through technologies such as visual or laser simultaneous localization and mapping; the hardware requirements are therefore low, ordinary embedded devices can meet the computational demand, the cost of the unmanned aerial vehicle equipment is reduced, and less heat is generated during operation. The central unmanned aerial vehicle is selected as the command center according to the spatial distribution of the group and the communication conditions among the unmanned aerial vehicles, so the procedure is simple. Serving as the center of the whole group, the central unmanned aerial vehicle identifies the positions of the other unmanned aerial vehicles and collects, analyzes and processes images of the whole area through its camera, which makes it convenient to plan and control the path of the whole unmanned aerial vehicle group in a unified manner.
Drawings
The present application is further illustrated by the accompanying drawings, which are not to be construed as limiting the application in any way.
FIG. 1 is a schematic view of imaging in the x-axis direction according to the present application;
FIG. 2 is a schematic view of imaging in the y-axis direction according to the present application;
FIG. 3 is a position coordinate diagram of a pixel point P according to the present application;
FIG. 4 is a schematic diagram of the length relationship between pixel points of the image and actual ground points according to the present application.
Detailed Description
The technical scheme of the application is further described below by the specific embodiments with reference to the accompanying drawings.
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the application.
In the description of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
Referring to fig. 1 to 4, a two-dimensional plane-based unmanned aerial vehicle group path recognition method includes the following steps:
detecting GPS signals in real time: the unmanned aerial vehicle group detects the GPS signal condition in real time; when the GPS signal is detected to be smaller than a set value, the unmanned aerial vehicle group stops advancing and switches from the GPS navigation mode to an autonomous navigation mode;
determining a temporary director: the n unmanned aerial vehicles are numbered and sequenced; each unmanned aerial vehicle sends a signal to all the other unmanned aerial vehicles to request their space coordinates, every unmanned aerial vehicle that receives the request returns its space coordinate, and the unmanned aerial vehicle that receives n-1 space coordinates serves as the temporary director;
adjusting the height of the unmanned aerial vehicle group: taking the height of the temporary director as a reference, all the other unmanned aerial vehicles are controlled to adjust their height so that it is kept consistent with the height of the temporary director;
determining the central unmanned aerial vehicle: the sum of the Euclidean distances between each unmanned aerial vehicle and the remaining unmanned aerial vehicles is calculated, the n resulting distance sums are compared, and the unmanned aerial vehicle with the smallest sum is taken as the central unmanned aerial vehicle;
collecting ground images: the central unmanned aerial vehicle is raised at a constant speed, and the camera carried by the central unmanned aerial vehicle photographs the ground during the ascent;
identifying the road: after the central unmanned aerial vehicle has risen to a set height, the whole area is photographed, and the image of the whole area is processed to identify the road;
path planning: the image of the whole area is gridded, and a path is planned for the position of the whole unmanned aerial vehicle group;
moving the unmanned aerial vehicle group: the moving distance of the unmanned aerial vehicle group is obtained from the relation between image distance and actual distance, and the unmanned aerial vehicle group is controlled to move.
By adopting the method, when there is no GPS signal or the unmanned aerial vehicle group moves into an area with a poor GPS signal, the group can switch to the autonomous navigation mode. The central unmanned aerial vehicle is selected as the command center according to the spatial distribution of the group and the communication conditions among the unmanned aerial vehicles, so the procedure is simple. Serving as the center of the whole group, the central unmanned aerial vehicle identifies the positions of the other unmanned aerial vehicles and collects, analyzes and processes images of the whole area through its camera, which makes it convenient to plan and control the path of the whole unmanned aerial vehicle group in a unified manner. Real-time path planning is completed without technologies such as visual or laser simultaneous localization and mapping, so the hardware requirements are low, ordinary embedded devices can meet the computational demand, the cost of the unmanned aerial vehicle equipment is reduced, and less heat is generated during operation.
According to the application, the pixel distance of an unmanned aerial vehicle in the image can be converted into an actual distance using the scale, and path planning is performed in the 2D plane; in the cooperative control of the unmanned aerial vehicles, the central unmanned aerial vehicle is selected according to the spatial distribution and the communication state of the group; the relative positions between the unmanned aerial vehicles and the central unmanned aerial vehicle are determined using the distribution characteristics of the elliptical protection rings of the unmanned aerial vehicles; and the road is identified in the 2D plane, on the basis of which path planning is carried out with a dynamic A-star algorithm.
Specifically, in the step of detecting a GPS signal in real time: when the GPS signal is detected to be smaller than the set value, the GPS signal currently acquired by the unmanned aerial vehicle group and the space coordinate acquired by the sensor of the unmanned aerial vehicle are recorded.
The unmanned aerial vehicle group detects the GPS signal in real time. When the GPS signal is smaller than the set value, i.e. the group has entered an area with a poor GPS signal and can no longer maintain reliable communication, the group switches to the autonomous navigation mode, plans its route according to the real-time road conditions and completes the flight smoothly. The currently acquired GPS signal and space coordinate data are recorded to provide reference data for the subsequent flight.
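As a rough illustration of this mode switch, the following sketch checks the detected signal against the set value; the threshold and the recorded example values are assumptions for illustration only, not values taken from the application.

```python
GPS_THRESHOLD = 30.0  # assumed "set value" for acceptable GPS signal quality

def should_switch_to_autonomous(gps_strength, threshold=GPS_THRESHOLD):
    """Return True when the detected GPS signal is smaller than the set value."""
    return gps_strength < threshold

# usage: record the current GPS fix and sensor coordinate, stop and switch modes
if should_switch_to_autonomous(12.5):
    last_reference = {"gps_fix": (23.03, 113.40),       # last GPS reading (example values)
                      "space_coordinate": (5.0, 8.0, 30.0)}  # sensor-derived coordinate
    navigation_mode = "autonomous"  # the group stops advancing and leaves GPS navigation
```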
Preferably, in the step of determining the temporary director: if one unmanned aerial vehicle simultaneously receives requests from a plurality of unmanned aerial vehicles to serve as the temporary director, the unmanned aerial vehicle with the smallest number among those unmanned aerial vehicles is taken as the temporary director, and the number of the temporary director is broadcast to all unmanned aerial vehicles.
To prevent several unmanned aerial vehicles from simultaneously requesting to serve as the temporary director, which would cause control-signal interference among them, the unmanned aerial vehicle with the smallest number is taken as the temporary director and completes the subsequent regulation and control of the unmanned aerial vehicle group.
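A minimal sketch of this election rule is given below; the dictionary of reply counts is an assumed bookkeeping structure rather than anything specified by the application.

```python
def elect_temporary_director(reply_counts, n):
    """reply_counts: {drone_number: number of space coordinates received}.

    Every drone that received n-1 coordinates qualifies; the smallest-numbered
    qualifying drone becomes the temporary director, whose number is then broadcast.
    """
    candidates = [num for num, count in reply_counts.items() if count == n - 1]
    return min(candidates)

# usage: drones 3 and 7 both received n-1 replies, so drone 3 becomes the director
print(elect_temporary_director({3: 4, 5: 2, 7: 4, 9: 3}, n=5))  # -> 3
```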
Meanwhile, the step of adjusting the height of the unmanned aerial vehicle group further includes coordinate anti-collision detection: the overall length of an unmanned aerial vehicle is set to y and its width to x, the space coordinate of any unmanned aerial vehicle is (x_a, y_a, z_a), and the space coordinate of another, adjacent unmanned aerial vehicle is (x_b, y_b, z_b); if the space coordinates of the two unmanned aerial vehicles satisfy |x_a - x_b| < 4x, |x_a - x_b| < 4y, |y_a - y_b| < 4x and |y_a - y_b| < 4y, it is judged that the two unmanned aerial vehicles will collide, and the temporary director controls the corresponding unmanned aerial vehicle to move laterally toward the area with fewer unmanned aerial vehicles until the space coordinates of all unmanned aerial vehicles no longer satisfy the collision condition; the temporary director then controls the remaining unmanned aerial vehicles to ascend or descend to the same height as the temporary director.
While the temporary director adjusts the unmanned aerial vehicle group to a unified height, in order to prevent collisions between adjacent unmanned aerial vehicles, the distance between any two unmanned aerial vehicles must satisfy the above space coordinate relation. The area with fewer unmanned aerial vehicles refers to the region where the spacing between adjacent unmanned aerial vehicles is large, i.e. the region in which fewer unmanned aerial vehicles are distributed over the same area; the distribution of the unmanned aerial vehicles can be obtained from the distribution of their space coordinates.
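The anti-collision test itself reduces to the four inequalities above. The sketch below assumes that x and y are the airframe width and overall length in meters and that coordinates are given as (x, y, z) tuples.

```python
def may_collide(a, b, width_x, length_y):
    """a, b: (x, y, z) space coordinates of two adjacent drones.

    Returns True when all four inequalities of the coordinate anti-collision
    detection hold, i.e. the temporary director should move one drone laterally.
    """
    dx = abs(a[0] - b[0])
    dy = abs(a[1] - b[1])
    return (dx < 4 * width_x and dx < 4 * length_y and
            dy < 4 * width_x and dy < 4 * length_y)

# usage with assumed airframe dimensions of 0.5 m x 0.5 m
print(may_collide((1.0, 2.0, 10.0), (1.8, 2.5, 10.0), 0.5, 0.5))  # -> True
```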
Preferably, in the step of determining the central unmanned aerial vehicle: let the space coordinates of the n unmanned aerial vehicles be (x_1, y_1, z_1), (x_2, y_2, z_2), …, (x_n, y_n, z_n), where z_1 = z_2 = … = z_n; the sum of the distances between the first unmanned aerial vehicle and the remaining n-1 unmanned aerial vehicles is $E_1=\sum_{j=2}^{n}\sqrt{(x_1-x_j)^2+(y_1-y_j)^2}$; in the same way, the distance sum E_2 from the second unmanned aerial vehicle to the remaining n-1 unmanned aerial vehicles, …, and the distance sum E_n from the n-th unmanned aerial vehicle to the remaining n-1 unmanned aerial vehicles are obtained; the minimum value among E_1 to E_n is E_i, and the i-th unmanned aerial vehicle is selected as the central unmanned aerial vehicle.
The sums of the Euclidean distances between each of the n unmanned aerial vehicles and the remaining n-1 unmanned aerial vehicles are solved, and the results E_1 to E_n are compared to obtain the minimum value E_i; that is, the distance from the i-th unmanned aerial vehicle to the remaining n-1 unmanned aerial vehicles is the smallest, so taking the i-th unmanned aerial vehicle as the central unmanned aerial vehicle minimizes the communication paths of the whole group and maximizes the signal transmission efficiency.
Euclidean distance is a commonly used distance definition, referring to the true distance between two points in an m-dimensional space, or the natural length of the vector (i.e., the distance from the point to the origin). The euclidean distance in two and three dimensions is the actual distance between two points.
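Because the heights are unified before this step, the distance sums reduce to 2-D Euclidean distances in the x-y plane. A minimal sketch of the selection follows; the list-of-tuples input format is an assumption.

```python
import math

def select_central_drone(coords):
    """coords: list of (x, y, z) space coordinates, one per drone, all at the same height.

    Returns the index i whose distance sum E_i to the remaining n-1 drones is smallest.
    """
    sums = []
    for i, (xi, yi, _) in enumerate(coords):
        e_i = sum(math.hypot(xi - xj, yi - yj)
                  for j, (xj, yj, _) in enumerate(coords) if j != i)
        sums.append(e_i)
    return sums.index(min(sums))
```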
In the application, in the step of collecting the ground image: the central unmanned aerial vehicle is raised at a constant speed and stops after rising by the set height, and the central unmanned aerial vehicle determines the space coordinate positions of the remaining unmanned aerial vehicles by identifying their elliptical protection rings:
during the ascent of the central unmanned aerial vehicle, its camera continuously acquires multiple frames of pictures; the position range of each remaining unmanned aerial vehicle in the current frame is determined from its initial position in the previous frame, the area within the corresponding range is magnified and searched, the coordinate value of the unmanned aerial vehicle in the magnified image is identified, and this coordinate value is transformed back into the original, unmagnified picture, so that the coordinates of the current positions of the remaining unmanned aerial vehicles are obtained.
The elliptical protection ring is a frame structure carried by the unmanned aerial vehicle itself; it is relatively large and therefore easy for the camera to identify. Multiple frames of pictures are photographed continuously and the positions in them are identified, so the space coordinates of the remaining unmanned aerial vehicles are determined. The set ascending height is 4-8 m. When identifying an elliptical protection ring, in order to solve the problem that an unmanned aerial vehicle cannot be identified because it appears too small, the approximate range of the unmanned aerial vehicle in the current frame is determined from its initial position in the previous frame, this area is cropped, and the image of the area is magnified to increase the likelihood of identifying the unmanned aerial vehicle; after the unmanned aerial vehicle is identified, its coordinates in the cropped, magnified image are obtained and transformed back into the original, uncropped picture. During the ascent, the central unmanned aerial vehicle keeps identifying the other unmanned aerial vehicles at all times.
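A sketch of this crop-and-magnify search is shown below; detect_ellipse stands in for whatever protection-ring detector is actually used, and the window size and zoom factor are assumed values rather than parameters given in the application.

```python
import cv2

def locate_drone(frame, prev_pos, detect_ellipse, half_win=80, zoom=3.0):
    """frame: current camera frame; prev_pos: (u, v) pixel position in the previous frame;
    detect_ellipse: callable returning the (u, v) ring position in an image, or None."""
    u0, v0 = prev_pos
    h, w = frame.shape[:2]
    x1, y1 = max(0, u0 - half_win), max(0, v0 - half_win)
    x2, y2 = min(w, u0 + half_win), min(h, v0 + half_win)
    roi = frame[y1:y2, x1:x2]                       # cut out the predicted position range
    magnified = cv2.resize(roi, None, fx=zoom, fy=zoom)
    hit = detect_ellipse(magnified)                 # search in the magnified image
    if hit is None:
        return None
    u, v = hit
    # transform the coordinate back into the original, unmagnified picture
    return (int(x1 + u / zoom), int(y1 + v / zoom))
```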
Meanwhile, in the step of identifying the road: graying, Gaussian blur, Canny edge detection, irregular ROI region interception, Hough straight-line detection and lane calculation are carried out on the image of the whole region, so that the road is identified.
Graying: in the RGB model, if r=g=b, the color represents a gray color, where the value of r=g=b is called a gray value, and thus, only one byte is required for each pixel of the gray image to store the gray value, and the gray range is 0 to 255.
Gaussian blur: used in the preprocessing stage of the vision algorithm to enhance the image at different scales.
Canny edge detection: a multi-stage edge detection algorithm developed by John F. Canny in 1986.
Irregular ROI region interception: in image processing, the region to be processed, called the ROI region, is outlined from the image being processed in the form of a square, circle, ellipse, irregular polygon or the like.
Hough straight-line detection: by calculating the local maxima of the accumulated results in a parameter space, a set of points conforming to a specific shape is obtained as the Hough transform result; the method can detect shapes such as straight lines, circles and ellipses.
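For illustration, the chain of operations listed above maps onto standard OpenCV calls as sketched below; the kernel size, Canny thresholds, ROI polygon and Hough parameters are assumed values, not those of the application.

```python
import cv2
import numpy as np

def find_road_lines(bgr_image, roi_polygon):
    """roi_polygon: list of (x, y) vertices describing the irregular ROI region."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)         # graying
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)                 # Gaussian blur
    edges = cv2.Canny(blurred, 50, 150)                         # Canny edge detection
    mask = np.zeros_like(edges)
    cv2.fillPoly(mask, [np.array(roi_polygon, dtype=np.int32)], 255)
    roi_edges = cv2.bitwise_and(edges, mask)                    # irregular ROI interception
    lines = cv2.HoughLinesP(roi_edges, 1, np.pi / 180, 50,
                            minLineLength=40, maxLineGap=20)    # Hough straight-line detection
    return lines  # line segments used by the subsequent lane calculation
```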
Specifically, in the step of path planning: after the road is identified, path planning is carried out for the position of the whole unmanned aerial vehicle group through a dynamic A-star algorithm.
The dynamic A-star algorithm (A. Stentz, "The Focussed D* Algorithm for Real-Time Replanning," in Proc. IJCAI, vol. 95, pp. 1652-1659, 1995) is a typical heuristic search algorithm built on the basis of the Dijkstra algorithm; it is widely applied in game maps and in the real world for finding the shortest path between two points.
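For illustration only, the sketch below runs a plain A* search on a gridded road image as a simplified stand-in for the dynamic A* (D*) planner cited above; the 0/1 grid encoding and 4-connected moves are assumptions.

```python
import heapq

def a_star(grid, start, goal):
    """grid: 2-D list where 1 marks a passable road cell and 0 a blocked cell;
    start, goal: (row, col) tuples. Returns the cell path or None if unreachable."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    best_g = {start: 0}
    came_from = {}
    open_heap = [(h(start), start)]
    while open_heap:
        _, node = heapq.heappop(open_heap)
        if node == goal:
            path = [node]
            while node in came_from:
                node = came_from[node]
                path.append(node)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):     # 4-connected moves
            nxt = (node[0] + dr, node[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 1):
                g = best_g[node] + 1
                if g < best_g.get(nxt, float("inf")):
                    best_g[nxt] = g
                    came_from[nxt] = node
                    heapq.heappush(open_heap, (g + h(nxt), nxt))
    return None

# usage on a tiny 3x3 grid whose center cell is blocked
print(a_star([[1, 1, 1], [1, 0, 1], [1, 1, 1]], (0, 0), (2, 2)))
```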
Preferably, the step of moving the unmanned aerial vehicle group includes: solving the scale r_x between the pixel distance in the picture photographed at the current height and the actual distance on the ground, the formula of the scale r_x being $r_x=\frac{w}{L_x}=\frac{f_x}{H}$, wherein r_x denotes the scale in the x-axis direction, w denotes the width of the image in pixels in the x-axis direction, f_x denotes the focal length of the camera, H denotes the height of the camera above the ground, and L_x denotes the actual distance, i.e. the actual distance in the x-axis direction;
solving the scale r_y between the pixel distance in the picture photographed at the current height and the actual distance on the ground, the formula of the scale r_y being $r_y=\frac{h}{L_y}=\frac{f_x}{H}$, wherein h denotes the width of the image in pixels in the y-axis direction, f_x denotes the focal length of the camera, H denotes the height of the camera above the ground, and L_y denotes the actual distance, i.e. the actual distance in the y-axis direction;
for a pixel point P(x_1, y_1), the length of the pixel point P in the x-axis direction of the image is x_1, which is substituted into the formula $L_x=\frac{x_1}{r_x}=\frac{x_1 H}{f_x}$ to obtain the actual distance L_x in the x-axis direction; the length of the pixel point P in the y-axis direction of the image is y_1, which is substituted into the formula $L_y=\frac{y_1}{r_y}=\frac{y_1 H}{f_x}$ to obtain the actual distance L_y in the y-axis direction;
letting the point reached by the movement be P_1, the actual moving distance is $L=\sqrt{L_x^2+L_y^2}$.
Referring to FIG. 1, which is the imaging schematic in the x-axis direction, the height H of the camera above the ground and the focal length f_x of the camera can be measured by the sensors of the unmanned aerial vehicle, so the actual distance L_x in the x-axis direction can be obtained. Referring to FIG. 2, which is the imaging schematic in the y-axis direction, the actual distance L_y in the y-axis direction is obtained in the same way. Referring to FIG. 3, for the pixel point P with coordinates (x_1, y_1), its length in the x-axis direction of the image is x_1, and the corresponding actual distance in the x-axis direction is obtained through the scale as $\frac{x_1 H}{f_x}$; its length in the y-axis direction of the image is y_1, and its actual distance is $\frac{y_1 H}{f_x}$. In FIG. 4, P is the coordinate of the pixel point in the image, and the point P1 is the actual ground point corresponding to the pixel point P; the distance from the pixel point P to the coordinate center in image coordinates is $\sqrt{x_1^2+y_1^2}$, so the actual moving distance $\sqrt{L_x^2+L_y^2}$ can then be obtained from L_x and L_y.
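Converting a pixel offset into a ground distance therefore needs only the flight height and the focal length; the sketch below assumes a single focal length f in pixels for both axes and pixel offsets measured from the image center.

```python
import math

def pixel_to_ground_distance(px, py, f, height):
    """px, py: pixel offsets of point P from the image center; f: focal length in pixels;
    height: camera height above the ground in meters."""
    lx = px * height / f        # actual distance along the x axis
    ly = py * height / f        # actual distance along the y axis
    return math.hypot(lx, ly)   # straight-line ground distance to the target point

# usage: a point 200 px right and 150 px up, flight height 20 m, f = 800 px
print(pixel_to_ground_distance(200, 150, 800, 20.0))  # -> 6.25 (meters)
```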
In the description herein, reference to the term "embodiment," "example," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The technical principle of the present application is described above in connection with the specific embodiments. The description is made for the purpose of illustrating the general principles of the application and should not be taken in any way as limiting the scope of the application. Other embodiments of the application will be apparent to those skilled in the art from consideration of this specification without undue burden.

Claims (7)

1. The unmanned aerial vehicle group path identification method based on the two-dimensional plane is characterized by comprising the following steps of:
detecting GPS signals in real time: the unmanned aerial vehicle group detects the GPS signal condition in real time; when the GPS signal is detected to be smaller than a set value, the unmanned aerial vehicle group stops advancing and switches from the GPS navigation mode to an autonomous navigation mode;
determining a temporary director: the n unmanned aerial vehicles are numbered and sequenced; each unmanned aerial vehicle sends a signal to all the other unmanned aerial vehicles to request their space coordinates, every unmanned aerial vehicle that receives the request returns its space coordinate, and the unmanned aerial vehicle that receives n-1 space coordinates serves as the temporary director;
adjusting the height of the unmanned aerial vehicle group: taking the height of the temporary director as a reference, all the other unmanned aerial vehicles are controlled to adjust their height so that it is kept consistent with the height of the temporary director;
comprising the following steps: coordinate anti-collision detection is carried out: the overall length of an unmanned aerial vehicle is set to y and its width to x, the space coordinate of any unmanned aerial vehicle is (x_a, y_a, z_a), and the space coordinate of another, adjacent unmanned aerial vehicle is (x_b, y_b, z_b); if the space coordinates of the two unmanned aerial vehicles satisfy |x_a - x_b| < 4x, |x_a - x_b| < 4y, |y_a - y_b| < 4x and |y_a - y_b| < 4y, it is judged that the two unmanned aerial vehicles will collide, and the temporary director controls the corresponding unmanned aerial vehicle to move laterally toward the area with fewer unmanned aerial vehicles until the space coordinates of all unmanned aerial vehicles no longer satisfy the collision condition; the temporary director controls the other unmanned aerial vehicles to ascend or descend to the same height as the temporary director;
determining a central unmanned aerial vehicle: the sum of the Euclidean distances between each unmanned aerial vehicle and the remaining unmanned aerial vehicles is calculated, the n resulting distance sums are compared, and the unmanned aerial vehicle with the smallest sum is taken as the central unmanned aerial vehicle;
collecting ground images: the central unmanned aerial vehicle is raised at a constant speed, and the camera carried by the central unmanned aerial vehicle photographs the ground during the ascent;
identifying a road: after the central unmanned aerial vehicle has risen to a set height, the whole area is photographed, and the image of the whole area is processed to identify the road;
path planning: the image of the whole area is gridded, and a path is planned for the position of the whole unmanned aerial vehicle group;
moving the unmanned aerial vehicle group: the moving distance of the unmanned aerial vehicle group is obtained from the relation between image distance and actual distance, and the unmanned aerial vehicle group is controlled to move;
comprising the following steps: solving the scale r_x between the pixel distance in the picture photographed at the current height and the actual distance on the ground, the formula of the scale r_x being $r_x=\frac{w}{L_x}=\frac{f_x}{H}$, wherein r_x denotes the scale in the x-axis direction, w denotes the width of the image in pixels in the x-axis direction, f_x denotes the focal length of the camera, H denotes the height of the camera above the ground, and L_x denotes the actual distance, i.e. the actual distance in the x-axis direction;
solving the scale r_y between the pixel distance in the picture photographed at the current height and the actual distance on the ground, the formula of the scale r_y being $r_y=\frac{h}{L_y}=\frac{f_x}{H}$, wherein h denotes the width of the image in pixels in the y-axis direction, f_x denotes the focal length of the camera, H denotes the height of the camera above the ground, and L_y denotes the actual distance, i.e. the actual distance in the y-axis direction;
for a pixel point P(x_1, y_1), the length of the pixel point P in the x-axis direction of the image is x_1, which is substituted into the formula $L_x=\frac{x_1}{r_x}=\frac{x_1 H}{f_x}$ to obtain the actual distance L_x in the x-axis direction; the length of the pixel point P in the y-axis direction of the image is y_1, which is substituted into the formula $L_y=\frac{y_1}{r_y}=\frac{y_1 H}{f_x}$ to obtain the actual distance L_y in the y-axis direction;
letting the point reached by the movement be P_1, the actual moving distance is $L=\sqrt{L_x^2+L_y^2}$.
2. The method for identifying a path of an unmanned aerial vehicle based on a two-dimensional plane according to claim 1, wherein in the step of detecting a GPS signal in real time: when the GPS signal is detected to be smaller than the set value, the GPS signal currently acquired by the unmanned aerial vehicle group and the space coordinate acquired by the sensor of the unmanned aerial vehicle are recorded.
3. The method for identifying a path of a group of unmanned aerial vehicles based on a two-dimensional plane according to claim 1, wherein in the step of determining a temporary director: if one unmanned aerial vehicle simultaneously receives requests from a plurality of unmanned aerial vehicles to serve as the temporary director, the unmanned aerial vehicle with the smallest number among those unmanned aerial vehicles is taken as the temporary director, and the number of the temporary director is broadcast to all unmanned aerial vehicles.
4. The method for identifying a path of a group of unmanned aerial vehicles based on a two-dimensional plane according to claim 1, wherein in the step of determining a central unmanned aerial vehicle: let the space coordinates of the n unmanned aerial vehicles be (x_1, y_1, z_1), (x_2, y_2, z_2), …, (x_n, y_n, z_n), where z_1 = z_2 = … = z_n; the sum of the distances between the first unmanned aerial vehicle and the remaining n-1 unmanned aerial vehicles is $E_1=\sum_{j=2}^{n}\sqrt{(x_1-x_j)^2+(y_1-y_j)^2}$; in the same way, the distance sum E_2 from the second unmanned aerial vehicle to the remaining n-1 unmanned aerial vehicles, …, and the distance sum E_n from the n-th unmanned aerial vehicle to the remaining n-1 unmanned aerial vehicles are obtained; the minimum value among E_1 to E_n is E_i, and the i-th unmanned aerial vehicle is selected as the central unmanned aerial vehicle.
5. The method for identifying a path of an unmanned aerial vehicle based on a two-dimensional plane according to claim 1, wherein in the step of acquiring the ground image: the central unmanned aerial vehicle is raised at a constant speed and stops after rising by the set height, and the central unmanned aerial vehicle determines the space coordinate positions of the remaining unmanned aerial vehicles by identifying their elliptical protection rings:
during the ascent of the central unmanned aerial vehicle, its camera continuously acquires multiple frames of pictures; the position range of each remaining unmanned aerial vehicle in the current frame is determined from its initial position in the previous frame, the area within the corresponding range is magnified and searched, the coordinate value of the unmanned aerial vehicle in the magnified image is identified, and this coordinate value is transformed back into the original, unmagnified picture, so that the coordinates of the current positions of the remaining unmanned aerial vehicles are obtained.
6. The method for recognizing a path of a group of unmanned aerial vehicles based on a two-dimensional plane according to claim 1, wherein in the step of recognizing a road: graying, Gaussian blur, Canny edge detection, irregular ROI region interception, Hough straight-line detection and lane calculation are carried out on the image of the whole region, so that the road is identified.
7. The method for identifying a path of an unmanned aerial vehicle based on a two-dimensional plane according to claim 1, wherein in the step of path planning: after the road is identified, path planning is carried out for the position of the whole unmanned aerial vehicle group through a dynamic A-star algorithm.
CN202110595220.3A 2021-05-28 2021-05-28 Unmanned aerial vehicle group path recognition method based on two-dimensional plane Active CN113190047B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110595220.3A CN113190047B (en) 2021-05-28 2021-05-28 Unmanned aerial vehicle group path recognition method based on two-dimensional plane

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110595220.3A CN113190047B (en) 2021-05-28 2021-05-28 Unmanned aerial vehicle group path recognition method based on two-dimensional plane

Publications (2)

Publication Number Publication Date
CN113190047A CN113190047A (en) 2021-07-30
CN113190047B true CN113190047B (en) 2023-09-05

Family

ID=76986351

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110595220.3A Active CN113190047B (en) 2021-05-28 2021-05-28 Unmanned aerial vehicle group path recognition method based on two-dimensional plane

Country Status (1)

Country Link
CN (1) CN113190047B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015024407A1 (en) * 2013-08-19 2015-02-26 国家电网公司 Power robot based binocular vision navigation system and method based on
CN110823223A (en) * 2019-10-16 2020-02-21 中国人民解放军国防科技大学 Path planning method and device for unmanned aerial vehicle cluster
CN111123341A (en) * 2019-11-15 2020-05-08 西安电子科技大学 Three-dimensional co-location method for unmanned aerial vehicle group
CN111256682A (en) * 2020-05-07 2020-06-09 北京航空航天大学 Unmanned aerial vehicle group path planning method under uncertain condition

Also Published As

Publication number Publication date
CN113190047A (en) 2021-07-30

Similar Documents

Publication Publication Date Title
CN110221603B (en) Remote obstacle detection method based on laser radar multi-frame point cloud fusion
US10515271B2 (en) Flight device and flight control method
JP7252943B2 (en) Object detection and avoidance for aircraft
CN108873943B (en) Image processing method for centimeter-level accurate landing of unmanned aerial vehicle
CN110825101B (en) Unmanned aerial vehicle autonomous landing method based on deep convolutional neural network
CN112710318B (en) Map generation method, path planning method, electronic device, and storage medium
CN109891351B (en) Method and system for image-based object detection and corresponding movement adjustment manipulation
CN109191504A (en) A kind of unmanned plane target tracking
US11900668B2 (en) System and method for identifying an object in water
KR101261409B1 (en) System for recognizing road markings of image
CN111338382B (en) Unmanned aerial vehicle path planning method guided by safety situation
CN106444837A (en) Obstacle avoiding method and obstacle avoiding system for unmanned aerial vehicle
EP2166375B1 (en) System and method of extracting plane features
CN112802196B (en) Binocular inertia simultaneous positioning and map construction method based on dotted line feature fusion
CN112927264A (en) Unmanned aerial vehicle tracking shooting system and RGBD tracking method thereof
CN111510704A (en) Method for correcting camera dislocation and device using same
CN109490926B (en) Path planning method based on binocular camera and GNSS
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
CN113190047B (en) Unmanned aerial vehicle group path recognition method based on two-dimensional plane
CN108765444A (en) Ground T shape Moving objects detection and location methods based on monocular vision
CN116105721B (en) Loop optimization method, device and equipment for map construction and storage medium
CN115797397A (en) Method and system for robot to autonomously follow target person in all weather
CN116228849B (en) Navigation mapping method for constructing machine external image
Min Binocular stereo vision control method for landing position of four rotor UAV
CN112731918B (en) Ground unmanned platform autonomous following system based on deep learning detection tracking

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant