CN109324337B - Unmanned aerial vehicle route generation and positioning method and device and unmanned aerial vehicle - Google Patents
- Publication number
- CN109324337B (application CN201710641017.9A)
- Authority
- CN
- China
- Prior art keywords
- feature
- information
- aerial vehicle
- unmanned aerial
- map data
- Prior art date
- Legal status
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 73
- 238000005259 measurement Methods 0.000 claims description 18
- 238000004590 computer program Methods 0.000 claims description 14
- 238000004422 calculation algorithm Methods 0.000 claims description 11
- 238000000605 extraction Methods 0.000 claims description 9
- 238000003860 storage Methods 0.000 claims description 9
- 238000005520 cutting process Methods 0.000 claims description 8
- 238000004364 calculation method Methods 0.000 claims description 3
- 238000012937 correction Methods 0.000 claims description 3
- 230000011218 segmentation Effects 0.000 claims description 3
- 230000008520 organization Effects 0.000 claims 1
- 230000008569 process Effects 0.000 abstract description 14
- 230000006870 function Effects 0.000 abstract description 6
- 238000010586 diagram Methods 0.000 description 12
- 238000012545 processing Methods 0.000 description 5
- 230000009471 action Effects 0.000 description 3
- 238000001514 detection method Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 238000003709 image segmentation Methods 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 230000004075 alteration Effects 0.000 description 1
- 238000009826 distribution Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 238000007726 management method Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000000750 progressive effect Effects 0.000 description 1
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Navigation (AREA)
Abstract
The embodiment of the invention provides a route generation method and device for an unmanned aerial vehicle, a positioning method, and an unmanned aerial vehicle. The route generation method comprises the following steps: acquiring map data; extracting map data of the work area to be operated from the map data; acquiring first characteristic information from the work area map data; and planning a route over the work area map data based on the first characteristic information to obtain route information. Because the route planning process of the embodiment of the invention does not require GPS, the unmanned aerial vehicle can still perform route planning when there is no GPS signal or the GPS signal is blocked, which reduces the dependence of the unmanned aerial vehicle on GPS and enriches the route planning and navigation functions of the unmanned aerial vehicle.
Description
Technical Field
The present invention relates to the field of unmanned aerial vehicle technology, and in particular, to a method for generating a route of an unmanned aerial vehicle, a method for positioning, a route generating device of an unmanned aerial vehicle, an aircraft, and a computer-readable storage medium.
Background
An Unmanned Aerial Vehicle (UAV) is an aircraft operated without a pilot on board. Unmanned aerial vehicles have wide application and are often used in industries such as plant protection, city management, geology, meteorology, electric power, emergency and disaster relief, and video shooting.
The route task to be performed by the unmanned aerial vehicle is route information obtained by planning a route, according to certain rules, for the work objects in a work area, and the work area may be a closed region formed by a series of GPS (Global Positioning System) coordinates.
After the unmanned aerial vehicle obtains the route information, it also needs to determine its own position by positioning while performing the operation. In the prior art, a position reference can be provided for the unmanned aerial vehicle by GPS or, with higher precision, by Real-Time Kinematic (RTK) GPS.
Therefore, both route planning and the operation process in the prior art rely on GPS; once there is no GPS signal or the signal is blocked, the unmanned aerial vehicle cannot work normally.
Disclosure of Invention
In view of the above problems, embodiments of the present invention provide a route generation method for an unmanned aerial vehicle, a positioning method, and a corresponding route generation device for an unmanned aerial vehicle, an aircraft and a computer-readable storage medium that overcome, or at least partially solve, the above problems.
In order to solve the above problem, an embodiment of the present invention discloses a method for generating a route of an unmanned aerial vehicle, where the method includes:
acquiring map data;
extracting map data of a working area to be operated from the map data;
acquiring first characteristic information from the map data of the operation area;
and planning a route of the map data of the operation area based on the first characteristic information to obtain route information.
Preferably, the step of extracting the map data of the work area to be worked from the map data includes:
acquiring a plurality of measurement points marked in the map data;
and carrying out image cutting on the closed area surrounded by the plurality of measuring points to obtain map data of the operation area.
Preferably, the first feature information at least includes a feature point, a feature descriptor corresponding to the feature point, and a feature dictionary corresponding to the feature point, and the step of obtaining the first feature information from the map data of the work area includes:
dividing the map data of the operation area into a plurality of block data;
respectively extracting a preset number of feature points from each block data;
generating a feature descriptor corresponding to the feature point;
and generating a feature dictionary corresponding to the feature points.
Preferably, the unmanned aerial vehicle comprises an onboard camera, and the track information comprises a waypoint and attribute information of the waypoint; the step of planning the route of the map data of the operation area based on the first characteristic information to obtain the route information comprises the following steps:
carrying out route planning on the map data of the operation area to obtain a plurality of waypoints;
respectively acquiring three-dimensional coordinate data of the waypoints;
for each waypoint, collecting a feature descriptor and a feature dictionary of an area which can be covered by the field angle of the airborne camera by taking the pixel coordinates of the waypoint as a center, and generating a feature set;
organizing the three-dimensional coordinate data and the feature set into attribute information of the waypoint.
Preferably, in the track information, the distance between two waypoints is determined by the overlapping degree of two images continuously acquired by the onboard camera.
Preferably, the step of respectively acquiring the three-dimensional coordinate data of the waypoints comprises:
acquiring the geographic position coordinates of the pixels corresponding to the waypoints as plane coordinates;
acquiring the height of a pixel corresponding to the waypoint;
acquiring the ground height of the unmanned aerial vehicle relative to the ground during planning;
and taking the sum of the pixel height and the above-ground height as the height coordinate.
The embodiment of the invention also discloses a positioning method, which is applied to an unmanned aerial vehicle, wherein the unmanned aerial vehicle comprises an airborne camera, and the method comprises the following steps:
acquiring track information, wherein the track information comprises attribute information of a plurality of waypoints;
acquiring real-time image data acquired by the airborne camera according to a preset interval;
and determining the current position information of the unmanned aerial vehicle based on the flight path information and the real-time image data.
Preferably, the step of determining the current position information of the unmanned aerial vehicle based on the track information and the real-time image data comprises:
judging whether the unmanned aerial vehicle deviates from a route corresponding to the flight path information or not based on the flight path information and the real-time image data;
if the unmanned aerial vehicle does not deviate from the air route, acquiring temporary positioning information of the unmanned aerial vehicle;
and determining the current position information of the unmanned aerial vehicle based on the temporary positioning information.
Preferably, the attribute information includes a feature set associated with the waypoint, where the feature set includes a plurality of feature points and a first feature dictionary corresponding to each feature point;
the step of judging whether the unmanned aerial vehicle deviates from the route corresponding to the flight path information or not based on the flight path information and the real-time image data comprises the following steps:
extracting second feature information from the real-time image data, wherein the second feature information comprises a second feature dictionary;
calculating the matching degree of the second feature dictionary and each first feature dictionary in the feature set;
and if the first characteristic dictionary with the matching degree larger than the preset threshold value does not exist, judging that the unmanned aerial vehicle deviates from the route corresponding to the flight path information.
Preferably, the step of acquiring temporary positioning information of the unmanned aerial vehicle includes:
if the first feature dictionary with the matching degree larger than the preset threshold exists, obtaining feature points corresponding to the first feature dictionary with the matching degree larger than the preset threshold, and taking the region determined by the feature points as temporary positioning information of the unmanned aerial vehicle.
Preferably, the feature set further includes a first feature descriptor corresponding to each feature point, the second feature information includes a second feature descriptor, and the step of determining the current position information of the unmanned aerial vehicle based on the temporary positioning information includes:
determining neighborhood waypoints based on the temporary positioning information, and acquiring a first feature descriptor associated with the neighborhood waypoints;
matching the second feature descriptor with the first feature descriptor to determine a matched feature point sequence;
and determining the current position information of the unmanned aerial vehicle based on the matched feature point sequence.
Preferably, the step of determining the current position information of the unmanned aerial vehicle based on the matched feature point sequence comprises:
and calculating the matched characteristic point sequence by adopting a preset positioning algorithm, and determining the current position information of the unmanned aerial vehicle.
Preferably, the method further comprises:
determining the position offset of the current position information and the track information;
and correcting the current position information based on the position offset.
The embodiment of the invention also discloses a route generation device of the unmanned aerial vehicle, which comprises the following components:
the map data acquisition module is used for acquiring map data;
the operation area map data acquisition module is used for extracting operation area map data to be operated from the map data;
the first characteristic information acquisition module is used for acquiring first characteristic information from the map data of the operation area;
and the route planning module is used for planning routes of the map data of the operation area based on the first characteristic information to obtain route information.
Preferably, the work area map data acquisition module includes:
a measurement point marking sub-module for acquiring a plurality of measurement points marked in the map data;
and the image cutting sub-module is used for carrying out image cutting on the closed area formed by the plurality of measuring points to obtain the map data of the operation area.
Preferably, the first feature information at least includes a feature point, a feature descriptor corresponding to the feature point, and a feature dictionary corresponding to the feature point, and the first feature information acquiring module includes:
the map segmentation submodule is used for segmenting the map data of the operation area into a plurality of block data;
the characteristic point extraction submodule is used for extracting a preset number of characteristic points from each block data respectively;
the characteristic descriptor generation submodule is used for generating a characteristic descriptor corresponding to the characteristic point;
and the feature dictionary generating submodule is used for generating a feature dictionary corresponding to the feature points.
Preferably, the unmanned aerial vehicle comprises an onboard camera, and the track information comprises a waypoint and attribute information of the waypoint; the route planning module comprises:
the planning submodule is used for carrying out route planning on the map data of the operation area to obtain a plurality of waypoints;
the three-dimensional coordinate data acquisition submodule is used for respectively acquiring the three-dimensional coordinate data of the waypoints;
the characteristic set acquisition submodule is used for acquiring a characteristic descriptor and a characteristic dictionary of an area which can be covered by the field angle of the airborne camera by taking the pixel coordinates of the waypoint as the center for each waypoint to generate a characteristic set;
and the organizing submodule is used for organizing the three-dimensional coordinate data and the feature set into the attribute information of the waypoint.
Preferably, in the track information, the distance between two waypoints is determined by the overlapping degree of two images continuously acquired by the onboard camera.
Preferably, the three-dimensional coordinate data acquisition sub-module includes:
the plane coordinate acquisition unit is used for acquiring the geographic position coordinates of the pixels corresponding to the waypoints as plane coordinates;
the height coordinate acquisition unit is used for acquiring the height of the pixel corresponding to the waypoint, acquiring the above-ground height of the unmanned aerial vehicle relative to the ground during planning, and taking the sum of the pixel height and the above-ground height as the height coordinate.
The embodiment of the invention also discloses an unmanned aerial vehicle, which comprises an airborne camera, and the unmanned aerial vehicle further comprises:
the flight path information acquisition module is used for acquiring flight path information, and the flight path information comprises attribute information of a plurality of flight points;
the real-time image acquisition module is used for acquiring real-time image data acquired by the airborne camera according to preset intervals;
and the positioning module is used for determining the current position information of the unmanned aerial vehicle based on the flight path information and the real-time image data.
Preferably, the positioning module comprises:
the flight path deviation judging submodule is used for judging whether the unmanned aerial vehicle deviates from a flight path corresponding to the flight path information or not based on the flight path information and the real-time image data;
the temporary positioning information acquisition submodule is used for acquiring temporary positioning information of the unmanned aerial vehicle if the unmanned aerial vehicle does not deviate from the air route;
and the current position information acquisition submodule is used for determining the current position information of the unmanned aerial vehicle based on the temporary positioning information.
Preferably, the attribute information includes a feature set associated with the waypoint, where the feature set includes a plurality of feature points and a first feature dictionary corresponding to each feature point;
the lane departure judgment submodule includes:
a second feature information extraction unit configured to extract second feature information from the real-time image data, the second feature information including a second feature dictionary;
the matching degree calculation unit is used for calculating the matching degree of the second feature dictionary and each first feature dictionary in the feature set;
and the deviation unit is used for judging that the unmanned aerial vehicle deviates from the route corresponding to the flight path information if the first characteristic dictionary with the matching degree larger than the preset threshold does not exist.
Preferably, the temporary positioning information obtaining sub-module is further configured to:
if the first feature dictionary with the matching degree larger than the preset threshold exists, obtaining feature points corresponding to the first feature dictionary with the matching degree larger than the preset threshold, and taking the region determined by the feature points as temporary positioning information of the unmanned aerial vehicle.
Preferably, the feature set further includes a first feature descriptor corresponding to each feature point, the second feature information includes a second feature descriptor, and the current location information obtaining sub-module includes:
the neighborhood waypoint determining unit is used for determining neighborhood waypoints based on the temporary positioning information and acquiring a first feature descriptor associated with the neighborhood waypoints;
the matching unit is used for matching the second feature descriptor with the first feature descriptor and determining a matched feature point sequence;
and the position determining unit is used for determining the current position information of the unmanned aerial vehicle based on the matched characteristic point sequence.
Preferably, the position determination unit is further configured to:
and calculating the matched characteristic point sequence by adopting a preset positioning algorithm, and determining the current position information of the unmanned aerial vehicle.
Preferably, the unmanned aerial vehicle further comprises:
the position offset determining module is used for determining the position offset of the current position information and the track information;
and the offset correction module is used for correcting the current position information based on the position offset.
The embodiment of the invention also discloses an aircraft, which comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, and is characterized in that the steps of the method are realized when the processor executes the program.
The embodiment of the invention also discloses a computer readable storage medium, wherein a computer program is stored on the computer readable storage medium, and the computer program realizes the steps of the method when being executed by a processor.
The embodiment of the invention has the following advantages:
in the embodiment of the invention, the air route planning can be carried out according to the acquired map data, and the characteristic matching is carried out by combining the image data acquired by the airborne camera, so that the precise positioning of the unmanned aerial vehicle can be carried out under the condition of no GPS or no GPS signal.
Drawings
FIG. 1 is a flow chart illustrating the steps of an embodiment of a method for generating routes for an unmanned aerial vehicle according to the present invention;
FIG. 2 is a flow chart of the steps of one embodiment of a method of positioning of the present invention;
FIG. 3 is a block diagram of an embodiment of a route generation apparatus for an unmanned aerial vehicle according to the present invention;
fig. 4 is a block diagram of an embodiment of an unmanned aerial vehicle according to the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
During unmanned aerial vehicle plant protection operations, the unmanned aerial vehicle can be controlled by a flight control system (flight controller for short) through the whole flight process, including takeoff, airborne flight, execution of the operation task and return flight. The flight controller plays for the unmanned aerial vehicle the role a pilot plays for a manned aircraft, and is one of the most core technologies of the unmanned aerial vehicle.
An onboard camera can be installed on the unmanned aerial vehicle; after the unmanned aerial vehicle is started, the flight control system can control the onboard camera to capture images.
In the embodiment of the invention, the air route planning can be carried out according to the acquired map data, and the characteristic matching is carried out by combining the image data acquired by the airborne camera, so that the precise positioning of the unmanned aerial vehicle can be carried out under the condition of no GPS or no GPS signal.
Referring to FIG. 1, a flow chart illustrating the steps of an embodiment of a method for generating routes for an unmanned aerial vehicle of the present invention is shown. The method may include the following steps:
Step 101: acquiring map data.

In the embodiment of the present invention, map data may be collected in advance. The map data may be data including geographic location information (three-dimensional coordinate data), and its representation may include Tile or TIFF (Tagged Image File Format).
In the implementation, the map data can be obtained through aerial photogrammetry and splicing processing, and the map data can be stored in a server at the cloud end, a database of the unmanned aerial vehicle, a ground station, a remote controller and other positions.
Therefore, the unmanned aerial vehicle can request map data from a server at the cloud end, extract the map data from a local database, receive the map data sent by a ground station or a remote controller, and the like.
In one embodiment, the map data may include high-definition map data, where high-definition map data refers to map data at a scale of 1:500, or at a scale finer than 1:1000. In practice, the resolution of the map data affects the positioning accuracy of the unmanned aerial vehicle: the higher the required positioning accuracy, the higher the map resolution needs to be. For example, with map data at a scale of 1:500, the positioning accuracy can reach the 5 cm level.
Step 102: extracting map data of the working area to be operated from the map data.
in the embodiment of the invention, after the map data is obtained, the map data of the operation area to be operated can be extracted from the map data according to the actual operation requirement.
In a preferred embodiment of the present invention, step 102 may further comprise the following sub-steps:
a substep S11 of acquiring a plurality of measurement points marked in the map data;
In one implementation, after the job task is determined, the operator may, according to the job requirements, mark points on the work plot corresponding to the job task with a surveying device to determine a plurality of measurement points (s1, s2, s3, …, sn), and mark these measurement points in the map data.
In another implementation, the operator may also mark a plurality of measurement points (s1, s2, s3, …, sn) directly in the map data by visual inspection, according to actual needs.
For example, if the area to be worked is a field, a plurality of measurement points may be marked according to the distribution edge of the crop.
The plurality of measurement points are discrete, ordered coordinate points, and the coordinate point sequence may be expressed as a point set S = {s1, s2, s3, …, sn}.
And a substep S12 of performing image segmentation on the closed region surrounded by the plurality of measurement points to obtain map data of the work region.
Specifically, the measurement points in the point set S are sequentially connected according to a preset sequence to obtain a closed area, and the image of the closed area is cut out to obtain map data of the operation area.
It should be noted that the embodiment of the present invention is not limited to the above manner of determining the closed region; a person skilled in the art may determine it in other ways. For example, a plurality of target positions may be selected from the measurement points, and the closed region obtained by sequentially connecting these target positions.
Of course, within the work area, measurement points may also be placed around obstacles such as trees or utility poles, and connecting them generates a no-fly zone.
In implementation, each pixel of the map data has a coordinate, so the work area map data can be described by two-dimensional coordinates in a certain order. The order may be clockwise or counterclockwise, which is not limited in the embodiment of the present invention.
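As an illustration of the image-cutting sub-step, the following is a minimal sketch in Python with OpenCV, assuming the map has already been loaded as a raster array and the measurement points have already been converted to pixel coordinates; the function name and the returned crop offset are illustrative and not defined by the patent.

```python
import cv2
import numpy as np

def cut_work_area(map_image, measurement_points_px):
    """Cut the closed region enclosed by the ordered measurement points out of
    the map raster. `measurement_points_px` is an ordered list of (col, row)
    pixel coordinates, i.e. the point set S = {s1, ..., sn}."""
    pts = np.array(measurement_points_px, dtype=np.int32)

    # Rasterize the closed polygon formed by connecting the points in order.
    mask = np.zeros(map_image.shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [pts], 255)

    # Keep only pixels inside the polygon, then crop to its bounding box.
    work_area = cv2.bitwise_and(map_image, map_image, mask=mask)
    x, y, w, h = cv2.boundingRect(pts)
    return work_area[y:y + h, x:x + w], (x, y)   # crop plus its offset in the map
```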
Step 103: acquiring first feature information from the work area map data.

After the work area map data is obtained, feature extraction may be performed on the work area map data to obtain the first feature information.
As a preferable example of the embodiment of the present invention, the first feature information may include at least feature points, feature descriptors corresponding to the feature points, and feature dictionaries corresponding to the feature points.
In a preferred embodiment of the present invention, step 103 may comprise the following sub-steps:
a substep S21 of dividing the work area map data into a plurality of tile data;
for example, the image segmentation method may be adopted to segment the map data of the work area into N × M block data.
A substep S22 of extracting a preset number of feature points from each block data, respectively;
specifically, after obtaining N × M block data, t feature points may be extracted from each block data, respectively, to ensure that the feature points are uniformly distributed in the map data of the work area.
In practice, the N, M, t above may satisfy the following condition: a predetermined number of pixels corresponds to one feature point, for example, 100 pixels corresponds to one feature point.
Of course, besides the block, the feature point may also be directly extracted from the work area map data according to a preset feature extraction method, which is not limited in this embodiment of the present invention.
Feature points may refer to points whose locations themselves carry a conventional, recognizable meaning, such as corner points (FAST corners, Harris corners, etc.), intersections, and so on.
In a specific implementation, a computer vision method may be adopted to extract feature points from each block data, for example, for Fast corner points, the extraction manner may include the following processes: and traversing each pixel point in the block data, selecting 16 surrounding pixels by taking the current pixel point as a center and 3 as a radius, sequentially comparing, marking if the gray difference value is greater than a preset threshold value, and taking the current point as a feature point if the number of the marks is greater than 12.
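A minimal sketch of the block-wise extraction described above, using OpenCV's FAST detector; the block counts, the detector threshold of 20 and keeping the t strongest responses per block are assumptions chosen only to illustrate the even-distribution idea.

```python
import cv2

def extract_block_features(work_area_gray, n_blocks_x, n_blocks_y, t_per_block=50):
    """Split the work-area image into blocks and keep the t strongest FAST
    corners in each block so that feature points stay evenly distributed."""
    fast = cv2.FastFeatureDetector_create(threshold=20)
    h, w = work_area_gray.shape
    bh, bw = h // n_blocks_y, w // n_blocks_x
    keypoints = []
    for by in range(n_blocks_y):
        for bx in range(n_blocks_x):
            block = work_area_gray[by * bh:(by + 1) * bh, bx * bw:(bx + 1) * bw]
            kps = fast.detect(block, None)
            # Strongest corners first, keep at most t per block.
            kps = sorted(kps, key=lambda k: k.response, reverse=True)[:t_per_block]
            for kp in kps:
                # Shift block-local coordinates back to full-image coordinates.
                keypoints.append(cv2.KeyPoint(kp.pt[0] + bx * bw,
                                              kp.pt[1] + by * bh,
                                              kp.size, kp.angle, kp.response))
    return keypoints
```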
A substep S23 of generating a feature descriptor corresponding to the feature point;
after the feature points are obtained, a feature description is established for the feature points, and the feature description can be called a feature descriptor.
As an example, the feature descriptor may include a SURF (Speeded-Up Robust Features) descriptor, an ORB (Oriented FAST and Rotated BRIEF) descriptor, and the like.
In an implementation, a feature point may be described by combining its feature descriptor, its pixel coordinates and its geographic position coordinates, i.e. the description of the feature point may be denoted as x_i = {d_i, c_i, f_i}, where d_i is the feature descriptor of the i-th feature point, an n-dimensional vector; c_i is the pixel coordinate of the i-th feature point; and f_i is the geographic position coordinate of the i-th feature point, which can be represented by a three-dimensional vector, i.e. three-dimensional position data.
The set of all feature points and corresponding feature descriptors in the work area map data may be: X = {x_1, x_2, x_3, …, x_n}.
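The bundling of descriptor, pixel coordinate and geographic coordinate into x_i = {d_i, c_i, f_i} could look as follows; ORB is used here as one of the descriptor types mentioned above, and `pixel_to_geo` is a hypothetical helper derived from the map's georeferencing, not something defined by the patent.

```python
import cv2

def describe_feature_points(work_area_gray, keypoints, pixel_to_geo):
    """Compute an ORB descriptor d_i for every feature point and bundle it with
    the pixel coordinate c_i and the geographic coordinate f_i, giving the
    per-point record x_i = {d_i, c_i, f_i}. `pixel_to_geo` maps a (col, row)
    pixel to (x, y, z) map coordinates (assumed helper)."""
    orb = cv2.ORB_create()
    keypoints, descriptors = orb.compute(work_area_gray, keypoints)
    features = []
    for kp, d in zip(keypoints, descriptors):
        c = kp.pt                      # pixel coordinate c_i
        f = pixel_to_geo(c)            # three-dimensional geographic coordinate f_i
        features.append({"d": d, "c": c, "f": f})
    return features                    # X = {x_1, ..., x_n}
```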
And a substep S24 of generating a feature dictionary corresponding to the feature points.
After the feature points and the corresponding feature descriptors are obtained, a feature dictionary corresponding to each feature point can be created, and the feature dictionary is used for rapidly detecting the current approximate position of the unmanned aerial vehicle, and further performing high-precision image matching and positioning of the unmanned aerial vehicle.
In particular implementations, a loop-closure detection algorithm may be used to determine the feature dictionary corresponding to each feature point; for example, one such algorithm is the bag-of-words model of DBoW2.
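The patent names the DBoW2 bag-of-words model, which is a C++ library; as a rough, hedged stand-in, the sketch below clusters descriptors into visual words with OpenCV's k-means and represents a group of features as a normalized word histogram. The vocabulary size and the float32 cast of binary ORB descriptors are simplifying assumptions, not the patent's method.

```python
import numpy as np
import cv2

def build_vocabulary(all_descriptors, n_words=500):
    """Cluster all descriptors into visual words (a rough stand-in for a
    DBoW2-style vocabulary). ORB descriptors are binary, so they are cast to
    float32 for k-means; a production system would use DBoW2 itself."""
    data = np.float32(np.vstack(all_descriptors))
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(data, n_words, None, criteria, 3,
                                    cv2.KMEANS_PP_CENTERS)
    return centers                                   # vocabulary of visual words

def bow_vector(descriptors, vocabulary):
    """Represent a set of descriptors (e.g. those around one feature point or
    one waypoint) as a normalized visual-word histogram -- the 'feature dictionary'."""
    data = np.float32(np.atleast_2d(descriptors))
    # Assign each descriptor to its nearest visual word.
    dists = np.linalg.norm(data[:, None, :] - vocabulary[None, :, :], axis=2)
    words = np.argmin(dists, axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(np.float32)
    return hist / (np.linalg.norm(hist) + 1e-12)
```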
Step 104: performing route planning on the map data of the operation area based on the first characteristic information to obtain route information.
Through steps 101-103, the following have been obtained for the area to be worked: a piece of work area map data, a set of feature points with their corresponding feature descriptors, and a feature dictionary corresponding to each feature point. Subsequently, route planning may be performed for the area to be worked.
In a specific implementation, a planning scheme can be selected according to the operation requirements, and route planning can be performed directly on the work area map data. For example, if the operation requires the unmanned aerial vehicle to completely cover the area to be worked, a snake-shaped (boustrophedon) planning method can be adopted for route planning.
It should be noted that the embodiment of the present invention is not limited to the planning method of the route, and a series of discrete three-dimensional coordinate points (i.e. waypoints) and corresponding attribute information may be used to describe the planning result, i.e. the track information, regardless of the planning method.
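For the snake-shaped planning mentioned above, a minimal sketch of waypoint generation over a rectangular area follows; real work areas are polygons and the spacing values would come from the overlap relationship described later, so the rectangle and parameters here are purely illustrative.

```python
import numpy as np

def snake_waypoints(x_min, x_max, y_min, y_max, line_spacing, waypoint_spacing):
    """Generate boustrophedon ("snake") waypoints covering a rectangular work
    area in plane coordinates. The bounding rectangle is used only to
    illustrate the back-and-forth pattern."""
    waypoints = []
    ys = np.arange(y_min, y_max + line_spacing, line_spacing)
    for i, y in enumerate(ys):
        xs = np.arange(x_min, x_max + waypoint_spacing, waypoint_spacing)
        if i % 2 == 1:          # reverse every other line to get the snake pattern
            xs = xs[::-1]
        waypoints.extend((float(x), float(y)) for x in xs)
    return waypoints            # ordered plane coordinates of the planned waypoints
```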
In a preferred embodiment of the present invention, step 104 may further include the following sub-steps:
substep S31, performing route planning on the map data of the operation area to obtain a plurality of waypoints;
after the route planning is carried out on the operation area, a plurality of waypoints can be obtained, and the routes of the operation area can be obtained by connecting the plurality of waypoints.
The departure point of the flight path can be specified in the area to be operated according to any requirement, and the determining mode of the departure point is not limited in the embodiment of the invention.
A substep S32 of respectively obtaining three-dimensional coordinate data of the waypoints;
each waypoint may be described as a three-dimensional coordinate datum denoted as fi.
As an example, the three-dimensional coordinate data may include a plane coordinate and a height coordinate.
In a preferred embodiment of the present invention, the sub-step S32 further includes the following sub-steps: acquiring the geographic position coordinates of the pixel corresponding to the waypoint as the plane coordinates; acquiring the height of the pixel corresponding to the waypoint; acquiring the above-ground height of the unmanned aerial vehicle relative to the ground during planning; and taking the sum of the pixel height and the above-ground height as the height coordinate.
In order to realize the functions of positioning and navigation, the distance between two waypoints in the track information cannot be too sparse, and in one implementation mode, the distance between the two waypoints can be determined by the overlapping degree of two images continuously acquired by an airborne camera.
In practice, the above-mentioned overlap may be set to be more than 30%.
In a specific implementation, the relationship between the overlapping degree and the ground altitude of the unmanned aerial vehicle and the attribute of the onboard camera can be expressed as follows:
J = H · w_p · n_p · (1 − p) / f

where J is the time interval between two images captured by the onboard camera, H is the above-ground height of the unmanned aerial vehicle, w_p is the pixel size, n_p is the number of pixels, p is the degree of overlap, and f is the focal length of the onboard camera.
The pixel size and number of pixels above refer to the pixel size and pixel count along the sensor coordinate axis parallel to the flight direction.
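A small numeric check of the relationship above, assuming all lengths are expressed in metres; the camera parameters in the example are illustrative. Evaluated this way the quantity behaves as an along-track spacing between consecutive captures; dividing by the flight speed would then give a time interval.

```python
def capture_interval(h_ground, pixel_size, n_pixels, overlap, focal_length):
    """Evaluate J = H * w_p * n_p * (1 - p) / f from the description, with the
    above-ground height H, the pixel size w_p and pixel count n_p along the
    flight direction, the required overlap p and the focal length f."""
    return h_ground * pixel_size * n_pixels * (1.0 - overlap) / focal_length

# Example (illustrative values): 30 m above ground, 3.45 um pixels,
# 4000 pixels along track, 70 % overlap, 8 mm focal length.
spacing = capture_interval(30.0, 3.45e-6, 4000, 0.7, 8e-3)   # ~15.5 m between captures
```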
A substep S33 of, for each waypoint, collecting a feature descriptor and a feature dictionary of a region that can be covered by the field angle of the onboard camera with the pixel coordinates of the waypoint as the center, and generating a feature set;
in the embodiment of the invention, in order to meet the navigation requirement of the unmanned aerial vehicle, the characteristic set of the waypoint can be determined.
Specifically, for each waypoint, the feature descriptors and feature dictionaries of the area that can be covered by the field angle (FOV) of the onboard camera may be collected with the pixel coordinate of the waypoint as the center, and the set of collected feature descriptors and feature dictionaries is recorded as a feature set d_i = {x_ij, t_ij}, where x_ij is the feature descriptor of the j-th feature point corresponding to waypoint i, and t_ij is the feature dictionary of the j-th feature point corresponding to waypoint i.
And a substep S34 of organizing the three-dimensional coordinate data and the feature set into attribute information of the waypoint.
After the three-dimensional coordinate data and the feature set of each waypoint are obtained, they may be organized into the attribute information of the waypoint, i.e. the description of the waypoint is h_i = {d_i, f_i}.
The track information obtained in step 104 can thus be described as H = {h_1, h_2, h_3, …, h_n}.
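One possible way to organize h_i = {d_i, f_i} in code; the dataclass layout and the `collect_features` helper are assumptions made for illustration, not structures defined by the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class WaypointAttributes:
    """One waypoint record h_i = {d_i, f_i}: its three-dimensional coordinate
    f_i and the feature set d_i collected inside the camera field of view."""
    coord: Tuple[float, float, float]                        # f_i = (x, y, z)
    feature_set: List[dict] = field(default_factory=list)    # d_i = [{descriptor, dictionary}, ...]

def build_track(waypoint_coords, collect_features):
    """Assemble the track H = {h_1, ..., h_n}. `collect_features(coord)` is an
    assumed helper that gathers the descriptors and feature dictionaries
    covered by the camera field of view centred on the waypoint's pixel."""
    return [WaypointAttributes(coord=c, feature_set=collect_features(c))
            for c in waypoint_coords]
```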
In the embodiment of the invention, the air route planning can be carried out on the operation area according to the operation area map data to be operated extracted from the map data and the first characteristic information acquired from the operation area map data, and the flight path information is acquired.
Referring to fig. 2, a flow chart of steps of an embodiment of the method of positioning of the present invention is shown, which when applied to the above-described unmanned aerial vehicle, may include the steps of:
Step 201: acquiring track information, where the track information includes attribute information of a plurality of waypoints.

As a preferred example of the embodiment of the present invention, the attribute information of a waypoint may include, but is not limited to: a feature set associated with the waypoint and the three-dimensional coordinate data of the waypoint, where the feature set may include a plurality of feature points and a first feature dictionary corresponding to each feature point.
For the generation manner of the track information, reference may be made to the method in the embodiment of fig. 1, which is not described herein again.
In implementation, after generating the flight path information according to the embodiment of fig. 1, the flight path information may be stored in a server at the cloud end, a database of the unmanned aerial vehicle, a ground station, a remote controller, and the like.
Therefore, the unmanned aerial vehicle may request the track information from the cloud server, extract the track information from the local database, and receive the track information sent by the ground station or the remote controller, which is not limited in this embodiment of the present invention.
Step 202: acquiring real-time image data collected by the onboard camera at a preset interval.

In the embodiment of the invention, after the unmanned aerial vehicle is started, the onboard camera can be started and controlled to continuously acquire real-time image data at the preset interval.
Step 203: determining the current position information of the unmanned aerial vehicle based on the track information and the real-time image data.

Each time the unmanned aerial vehicle acquires real-time image data, the data can be compared with the track information, so that the real-time position information of the unmanned aerial vehicle is determined from the track information and the real-time image data.
In a preferred embodiment of the present invention, step 203 further comprises the following sub-steps:
a substep S41, judging whether the unmanned aerial vehicle deviates from a route corresponding to the flight path information based on the flight path information and the real-time image data;
firstly, the unmanned aerial vehicle can judge whether the current unmanned aerial vehicle is near a planned route according to the track information and the currently acquired real-time image data, namely whether the current unmanned aerial vehicle deviates from the route corresponding to the track information.
In a preferred embodiment of the present invention, the sub-step S41 further includes the following sub-steps:
substep S411, extracting second feature information from the real-time image data;
as a preferred embodiment of the present invention, the second feature information may include a plurality of second feature points, a second feature dictionary corresponding to each second feature point, a second feature descriptor, and the like.
For the process of extracting the second feature information from the real-time image data, reference may be made to the process of extracting the first feature information in step 103 in the embodiment of fig. 1, which is not described herein again.
Substep S412, calculating a matching degree between the second feature dictionary and each first feature dictionary in the feature set;
after second feature dictionaries corresponding to a plurality of second feature points in the real-time image data are obtained, each second feature dictionary can be matched with each first feature dictionary in the feature set respectively, so that the matching degree of each second feature dictionary with each first feature dictionary in each feature set is calculated.
In the embodiment of the present invention, the method for calculating the matching degree is not limited, and for example, the method for calculating the similarity may be used to calculate the matching degree between the first feature dictionary and the second feature dictionary.
And in the substep S413, if the first feature dictionary with the matching degree larger than the preset threshold does not exist, judging that the unmanned aerial vehicle deviates from the route corresponding to the flight path information.
After the matching degrees between all the second feature dictionaries and each first feature dictionary in the feature set are calculated, if no matching degree greater than the preset threshold exists, it can be judged that the unmanned aerial vehicle has deviated from the route corresponding to the track information, where deviating from the route means that the unmanned aerial vehicle has strayed by at least one flight strip.
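A sketch of the matching-degree test and the deviation decision; the patent leaves the similarity measure open, so cosine similarity between normalized bag-of-words vectors and a threshold of 0.6 are assumptions used only to make the logic concrete.

```python
import numpy as np

def off_route(second_dict, first_dicts, threshold=0.6):
    """Return (True, None) if no waypoint feature dictionary matches the live
    image's dictionary well enough, i.e. the UAV is judged to be off the route.
    Otherwise return (False, index) of the best-matching first dictionary,
    which serves as the temporary positioning of the UAV."""
    best, best_idx = 0.0, None
    for idx, first_dict in enumerate(first_dicts):
        score = float(np.dot(second_dict, first_dict))   # both vectors are unit-length
        if score > best:
            best, best_idx = score, idx
    if best > threshold:
        return False, best_idx      # on route; best_idx gives the rough position
    return True, None               # off route: no dictionary exceeds the threshold
```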
A substep S42, if the unmanned aerial vehicle does not deviate from the air route, acquiring temporary positioning information of the unmanned aerial vehicle;
in a preferred embodiment of the present invention, the sub-step S42 further includes the following sub-steps:
if the first feature dictionary with the matching degree larger than the preset threshold exists, obtaining feature points corresponding to the first feature dictionary with the matching degree larger than the preset threshold, and taking the region determined by the feature points as temporary positioning information of the unmanned aerial vehicle.
If the matching degree is larger than the preset threshold value, the unmanned aerial vehicle can be judged to be on the route corresponding to the track information, namely the unmanned aerial vehicle is judged not to deviate from the route corresponding to the track information. At this time, feature points corresponding to the first feature dictionary with the matching degree greater than the preset threshold value may be determined as matching feature points, and the region determined by the matching feature points is used as temporary positioning information of the unmanned aerial vehicle, where the temporary positioning information is a rough flight position of the unmanned aerial vehicle.
It should be noted that, if the unmanned aerial vehicle flies according to the flight lines sequentially, because the adjacent waypoints to be matched are determined, the rough flight position of the unmanned aerial vehicle can be determined according to the waypoints already flown without using the feature dictionary.
And a substep S43 of determining current position information of the unmanned aerial vehicle based on the temporary positioning information.
After the temporary positioning information of the unmanned aerial vehicle is obtained, the accurate position information of the unmanned aerial vehicle can be obtained according to the rough positioning information.
In a preferred embodiment of the present invention, the sub-step S43 further includes the following sub-steps:
substep S431, determining neighborhood waypoints based on the temporary positioning information, and acquiring a first feature descriptor associated with the neighborhood waypoints;
substep S432, matching the second feature descriptor with the first feature descriptor, and determining a matched feature point sequence;
after the temporary positioning information of the unmanned aerial vehicle is obtained, the adjacent waypoints, namely the neighborhood waypoints, can be determined according to the temporary positioning information, and the first feature descriptors associated with the neighborhood waypoints are obtained.
Subsequently, the second feature descriptors corresponding to the real-time image data can be respectively matched with the first feature descriptors associated with the neighborhood waypoints, and a group of feature point sequences is obtained after matching is completed.
In one embodiment, the sequence of feature points may include two-dimensional coordinates of matched feature points of the real-time image data and three-dimensional coordinates of feature points of the corresponding matched work area map data.
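The descriptor-matching sub-step could be sketched as follows with OpenCV's brute-force Hamming matcher (appropriate for binary descriptors such as ORB); the record layout of `neighborhood_features` follows the illustrative structure used earlier and is an assumption.

```python
import cv2
import numpy as np

def match_feature_sequence(second_descriptors, second_keypoints, neighborhood_features):
    """Match the live-image descriptors against the first descriptors attached
    to the neighbourhood waypoints, returning the matched sequence of 2D image
    points and 3D map points. `neighborhood_features` is a list of records
    {"d": descriptor, "f": (x, y, z)} built during route generation."""
    first_descriptors = np.vstack([rec["d"] for rec in neighborhood_features])
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(np.asarray(second_descriptors), first_descriptors)
    matches = sorted(matches, key=lambda m: m.distance)

    image_points, map_points = [], []
    for m in matches:
        image_points.append(second_keypoints[m.queryIdx].pt)       # 2D, live image
        map_points.append(neighborhood_features[m.trainIdx]["f"])  # 3D, map
    return np.float32(image_points), np.float32(map_points)
```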
And a substep S433, determining the current position information of the unmanned aerial vehicle based on the matched characteristic point sequence.
In a preferred embodiment of the present invention, the sub-step S433 may further include: and calculating the matched characteristic point sequence by adopting a preset positioning algorithm, and determining the current position information of the unmanned aerial vehicle.
In one embodiment, the predetermined positioning algorithm may include, but is not limited to, a PnP (Perspective-n-Point) camera pose estimation algorithm.
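A hedged sketch of the PnP positioning step using OpenCV's RANSAC-based solver; the camera intrinsic matrix is assumed to come from a prior calibration, and recovering the camera centre as −RᵀT is standard geometry rather than anything specified by the patent.

```python
import cv2
import numpy as np

def locate_uav(map_points_3d, image_points_2d, camera_matrix, dist_coeffs=None):
    """Estimate the camera (UAV) position from matched 3D map points and their
    2D projections in the live image using OpenCV's PnP solver with RANSAC."""
    dist_coeffs = np.zeros(5) if dist_coeffs is None else dist_coeffs
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.float32(map_points_3d), np.float32(image_points_2d),
        camera_matrix, dist_coeffs)
    if not ok:
        return None
    # Camera position in map coordinates: C = -R^T * t
    R, _ = cv2.Rodrigues(rvec)
    return (-R.T @ tvec).ravel()      # current (x, y, z) of the UAV
```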
In a preferred embodiment of the present invention, after obtaining the current position information of the unmanned aerial vehicle, the method may further include the following steps:
determining the position offset of the current position information and the track information; and correcting the current position information based on the position offset.
In a specific implementation, a position difference between the current position information and a predetermined waypoint position of the flight path information may be calculated as a position offset amount, and the position offset amount is input to the flight controller to correct the current deviation.
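A minimal sketch of the offset computation; taking the nearest planned waypoint as the reference is one reasonable choice, since the description only says the offset is the difference between the current position and a predetermined waypoint position.

```python
import numpy as np

def position_offset(current_position, planned_waypoints):
    """Compute the offset between the current PnP position and the nearest
    planned waypoint; the flight controller can use this vector to steer the
    UAV back onto the planned route."""
    current = np.asarray(current_position, dtype=float)
    waypoints = np.asarray(planned_waypoints, dtype=float)
    nearest = waypoints[np.argmin(np.linalg.norm(waypoints - current, axis=1))]
    return nearest - current          # correction vector fed to the flight controller
```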
In the embodiment of the invention, the feature dictionary is used to roughly position the unmanned aerial vehicle, which narrows the subsequent feature-descriptor matching and PnP computation that yield more accurate current position information. Positioning and navigation of the unmanned aerial vehicle are thus realized when there is no GPS signal or the GPS signal is blocked, and the dependence of the unmanned aerial vehicle on GPS is reduced.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 3, a block diagram of an embodiment of the route generation device of the unmanned aerial vehicle of the present invention is shown, which may include the following modules:
a map data acquisition module 301, configured to acquire map data;
a working area map data acquisition module 302, configured to extract working area map data to be worked from the map data;
a first feature information obtaining module 303, configured to obtain first feature information from the work area map data;
and the route planning module 304 is configured to perform route planning on the map data of the operation area based on the first feature information, so as to obtain route information.
In a preferred embodiment of the present invention, the operation area map data obtaining module 302 further includes the following sub-modules:
a measurement point marking sub-module for acquiring a plurality of measurement points marked in the map data;
and the image cutting sub-module is used for carrying out image cutting on the closed area formed by the plurality of measuring points to obtain the map data of the operation area.
In a preferred embodiment of the present invention, the first feature information at least includes a feature point, a feature descriptor corresponding to the feature point, and a feature dictionary corresponding to the feature point, and the first feature information obtaining module 303 further may include the following sub-modules:
the map segmentation submodule is used for segmenting the map data of the operation area into a plurality of block data;
the characteristic point extraction submodule is used for extracting a preset number of characteristic points from each block data respectively;
the characteristic descriptor generation submodule is used for generating a characteristic descriptor corresponding to the characteristic point;
and the feature dictionary generating submodule is used for generating a feature dictionary corresponding to the feature points.
In a preferred embodiment of the present invention, the unmanned aerial vehicle includes an onboard camera, and the track information includes a waypoint and attribute information of the waypoint; the route planning module 304 may further include sub-modules to:
the planning submodule is used for carrying out route planning on the map data of the operation area to obtain a plurality of waypoints;
the three-dimensional coordinate data acquisition submodule is used for respectively acquiring the three-dimensional coordinate data of the waypoints;
the characteristic set acquisition submodule is used for acquiring a characteristic descriptor and a characteristic dictionary of an area which can be covered by the field angle of the airborne camera by taking the pixel coordinates of the waypoint as the center for each waypoint to generate a characteristic set;
and the organizing submodule is used for organizing the three-dimensional coordinate data and the feature set into the attribute information of the waypoint.
In a preferred embodiment of the present invention, in the track information, the distance between two waypoints is determined by the overlapping degree of two images continuously acquired by the onboard camera.
In a preferred embodiment of the present invention, the three-dimensional coordinate data obtaining sub-module further includes:
the plane coordinate acquisition unit is used for acquiring the geographic position coordinates of the pixels corresponding to the waypoints as plane coordinates;
the height coordinate acquisition unit is used for acquiring the height of the pixel corresponding to the waypoint, acquiring the above-ground height of the unmanned aerial vehicle relative to the ground during planning, and taking the sum of the pixel height and the above-ground height as the height coordinate.
For the apparatus embodiment of fig. 3, since it is basically similar to the method embodiment described above, the description is simple, and for the relevant points, reference may be made to part of the description of the method embodiment.
Referring to fig. 4, a block diagram of an embodiment of an unmanned aerial vehicle according to the present invention is shown, where the unmanned aerial vehicle may include an onboard camera, and in the embodiment of the present invention, the unmanned aerial vehicle may further include the following modules:
a track information obtaining module 401, configured to obtain track information, where the track information includes attribute information of multiple waypoints;
a real-time image acquisition module 402, configured to acquire real-time image data acquired by the onboard camera at preset intervals;
and a positioning module 403, configured to determine current position information of the unmanned aerial vehicle based on the track information and the real-time image data.
In a preferred embodiment of the present invention, the positioning module 403 may include the following sub-modules:
the flight path deviation judging submodule is used for judging whether the unmanned aerial vehicle deviates from a flight path corresponding to the flight path information or not based on the flight path information and the real-time image data;
the temporary positioning information acquisition submodule is used for acquiring temporary positioning information of the unmanned aerial vehicle if the unmanned aerial vehicle does not deviate from the air route;
and the current position information acquisition submodule is used for determining the current position information of the unmanned aerial vehicle based on the temporary positioning information.
In a preferred embodiment of the present invention, the attribute information includes a feature set associated with the waypoint, where the feature set includes a plurality of feature points and a first feature dictionary corresponding to each feature point;
the lane departure judging sub-module may further include the following units:
a second feature information extraction unit configured to extract second feature information from the real-time image data, the second feature information including a second feature dictionary;
the matching degree calculation unit is used for calculating the matching degree of the second feature dictionary and each first feature dictionary in the feature set;
and the deviation unit is used for judging that the unmanned aerial vehicle deviates from the route corresponding to the flight path information if the first characteristic dictionary with the matching degree larger than the preset threshold does not exist.
In a preferred embodiment of the present invention, the temporary positioning information obtaining sub-module is further configured to:
if the first feature dictionary with the matching degree larger than the preset threshold exists, obtaining feature points corresponding to the first feature dictionary with the matching degree larger than the preset threshold, and taking the region determined by the feature points as temporary positioning information of the unmanned aerial vehicle.
In a preferred embodiment of the present invention, the feature set further includes a first feature descriptor corresponding to each feature point, the second feature information includes a second feature descriptor, and the current location information obtaining sub-module further includes the following units:
the neighborhood waypoint determining unit is used for determining neighborhood waypoints based on the temporary positioning information and acquiring a first feature descriptor associated with the neighborhood waypoints;
the matching unit is used for matching the second feature descriptor with the first feature descriptor and determining a matched feature point sequence;
and the position determining unit is used for determining the current position information of the unmanned aerial vehicle based on the matched characteristic point sequence.
In a preferred embodiment of the present invention, the position determining unit is further configured to:
and calculating the matched characteristic point sequence by adopting a preset positioning algorithm, and determining the current position information of the unmanned aerial vehicle.
In a preferred embodiment of the embodiments of the present invention, the unmanned aerial vehicle further includes the following modules:
the position offset determining module is used for determining the position offset of the current position information and the track information;
and the offset correction module is used for correcting the current position information based on the position offset.
For the unmanned aerial vehicle embodiment of fig. 4, since it is substantially similar to the method embodiment described above, the description is relatively simple, and reference may be made to part of the description of the method embodiment for relevant points.
In addition, the embodiment of the present invention also discloses an aircraft, which includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, wherein the processor implements the steps of the method of the above embodiment when executing the program, and the steps include:
acquiring track information, wherein the track information comprises attribute information of a plurality of waypoints;
acquiring real-time image data acquired by the airborne camera according to a preset interval;
and determining the current position information of the unmanned aerial vehicle based on the flight path information and the real-time image data.
In addition, an embodiment of the present invention also discloses a computer-readable storage medium, on which a computer program is stored, and the program, when executed by a processor, implements the steps of the method of the above embodiments.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts, reference may be made among the embodiments.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The unmanned aerial vehicle route generation and positioning method and device, and the unmanned aerial vehicle, provided by the present invention have been described in detail above. Specific examples are used herein to explain the principles and implementations of the invention, and the description of the embodiments is only intended to help in understanding the method and its core idea. Meanwhile, a person skilled in the art may, following the idea of the present invention, make changes to the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention.
Claims (26)
1. A route generation method for an unmanned aerial vehicle, the unmanned aerial vehicle comprising an onboard camera, the method comprising:
acquiring map data;
extracting map data of a working area to be operated from the map data;
acquiring first characteristic information from the map data of the operation area; the first feature information at least comprises feature points, feature descriptors corresponding to the feature points and a feature dictionary corresponding to the feature points;
performing route planning on the map data of the operation area based on the first feature information to obtain route information, wherein the route information comprises waypoints and attribute information of the waypoints;
performing route planning on the map data of the operation area based on the first feature information to obtain route information, wherein the route planning comprises the following steps:
carrying out route planning on the map data of the operation area to obtain a plurality of waypoints;
respectively acquiring three-dimensional coordinate data of the waypoints;
for each waypoint, collecting the feature descriptors and feature dictionaries of the area that can be covered by the field of view of the onboard camera, with the pixel coordinates of the waypoint as the center, and generating a feature set;
organizing the three-dimensional coordinate data and the feature set into attribute information of the waypoint.
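For illustration, the last two steps of claim 1 (collecting the features covered by the camera's field of view around each waypoint and organizing them, together with the 3-D coordinates, into attribute information) might look like the following sketch; the data types and names are assumptions, not part of the claim:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class MapFeature:
    """One first-feature entry extracted from the work-area map."""
    u: int                          # pixel column in the work-area map
    v: int                          # pixel row in the work-area map
    descriptor: bytes               # e.g. a binary ORB descriptor
    dictionary: Tuple[float, ...]   # bag-of-words vector for this point's patch

@dataclass
class Waypoint:
    u: int
    v: int
    xyz: Tuple[float, float, float]             # three-dimensional coordinates
    feature_set: List[MapFeature] = field(default_factory=list)

def attach_waypoint_attributes(waypoints, features, fov_width_px, fov_height_px):
    """For each planned waypoint, keep the features whose pixels fall inside
    the area the onboard camera's field of view would cover (centred on the
    waypoint pixel), and organise them with the 3-D coordinates into the
    waypoint's attribute information.
    """
    for wp in waypoints:
        wp.feature_set = [
            f for f in features
            if abs(f.u - wp.u) <= fov_width_px // 2
            and abs(f.v - wp.v) <= fov_height_px // 2
        ]
    return [{"xyz": wp.xyz, "feature_set": wp.feature_set} for wp in waypoints]
```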
2. The method according to claim 1, wherein the step of extracting the work area map data to be worked from the map data includes:
acquiring a plurality of measurement points marked in the map data;
and carrying out image cutting on the closed area surrounded by the plurality of measurement points to obtain map data of the operation area.
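One plausible realization of the image cutting of claim 2, using standard OpenCV calls to mask the polygon enclosed by the measurement points and crop its bounding box; the overall flow is an assumption for illustration:

```python
import numpy as np
import cv2

def crop_work_area(map_image, measurement_points):
    """The measurement points marked on the map enclose a polygon; mask
    everything outside it and crop to its bounding box to obtain the
    work-area map data.
    """
    pts = np.asarray(measurement_points, dtype=np.int32).reshape(-1, 1, 2)
    mask = np.zeros(map_image.shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [pts], 255)                 # closed area enclosed by the points
    x, y, w, h = cv2.boundingRect(pts)
    cropped = cv2.bitwise_and(map_image, map_image, mask=mask)[y:y + h, x:x + w]
    return cropped
```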
3. The method according to claim 1 or 2, wherein the step of acquiring first feature information from the work area map data includes:
dividing the map data of the operation area into a plurality of block data;
respectively extracting a preset number of feature points from each block data;
generating a feature descriptor corresponding to the feature point;
and generating a feature dictionary corresponding to the feature points.
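As an illustration of claim 3, the sketch below divides the work-area map into a grid of blocks, extracts a preset number of ORB features per block, and assigns each feature point a visual-word id as a simple stand-in for its "feature dictionary"; ORB and k-means clustering are illustrative choices, not requirements of the claim:

```python
import numpy as np
import cv2

def extract_first_feature_info(work_area_map, grid=(4, 4), per_block=50):
    """Block-wise feature extraction over the work-area map: a preset number
    of feature points per block, one descriptor per point, and a visual-word
    id per point as an illustrative 'feature dictionary'.
    """
    gray = (cv2.cvtColor(work_area_map, cv2.COLOR_BGR2GRAY)
            if work_area_map.ndim == 3 else work_area_map)
    orb = cv2.ORB_create(nfeatures=per_block)
    h, w = gray.shape[:2]
    bh, bw = h // grid[0], w // grid[1]
    keypoints, descriptors = [], []
    for r in range(grid[0]):
        for c in range(grid[1]):
            block = gray[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            kps, des = orb.detectAndCompute(block, None)
            if des is None:
                continue
            for kp, d in zip(kps, des):
                # shift block-local coordinates back into whole-map coordinates
                keypoints.append((kp.pt[0] + c * bw, kp.pt[1] + r * bh))
                descriptors.append(d)
    if not descriptors:
        return [], np.empty((0, 32), np.uint8), np.array([], dtype=np.int32)
    descriptors = np.asarray(descriptors)
    # Cluster descriptors into visual words (bag of words) and tag each point
    # with its word id -- one plausible construction of the feature dictionary.
    n_words = min(32, len(descriptors))
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, word_ids, _ = cv2.kmeans(descriptors.astype(np.float32), n_words, None,
                                criteria, 3, cv2.KMEANS_PP_CENTERS)
    return keypoints, descriptors, word_ids.ravel()
```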
4. The method according to claim 1, characterized in that in the track information, the distance between two waypoints is determined by the degree of overlap of two images successively acquired by the onboard camera.
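Claim 4 in numbers: with a pinhole footprint model, the along-track waypoint spacing follows directly from the required image overlap. The footprint formula and the 70 % default overlap below are assumptions for illustration:

```python
import math

def waypoint_spacing(altitude_m, fov_deg, overlap=0.7):
    """Two consecutive images must overlap by a given ratio, so the distance
    between waypoints is the ground footprint of one image times (1 - overlap).
    """
    footprint = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)
    return footprint * (1.0 - overlap)

# e.g. at 100 m altitude with a 60 degree field of view and 70 % overlap:
# footprint ~= 115.5 m, spacing ~= 34.6 m
print(round(waypoint_spacing(100, 60), 1))
```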
5. The method according to claim 1 or 4, wherein the step of respectively acquiring three-dimensional coordinate data of the waypoints comprises:
acquiring the geographic position coordinates of the pixels corresponding to the waypoints as plane coordinates;
acquiring the height of a pixel corresponding to the waypoint;
acquiring the height of the unmanned aerial vehicle above the ground during planning;
and taking the sum of the height and the height above the ground as the height coordinate.
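A short worked reading of claim 5: the height coordinate is the terrain elevation of the waypoint's pixel plus the planned height above ground. Parameter names below are illustrative:

```python
def waypoint_3d_coordinates(pixel_lon_lat, pixel_elevation_m, planned_agl_m):
    """Plane coordinates come from the geographic position of the waypoint's
    pixel; the height coordinate is that pixel's elevation plus the planned
    above-ground height of the flight.
    """
    lon, lat = pixel_lon_lat
    height = pixel_elevation_m + planned_agl_m
    return (lon, lat, height)

# e.g. a pixel at 42.0 m terrain elevation flown at 30 m above ground
# gives a height coordinate of 72.0 m.
```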
6. A method of positioning in an unmanned aerial vehicle, the unmanned aerial vehicle including an onboard camera, the method comprising:
acquiring track information, wherein the track information comprises attribute information of a plurality of waypoints;
acquiring real-time image data acquired by the airborne camera according to a preset interval;
determining current position information of the unmanned aerial vehicle based on the flight path information and the real-time image data;
wherein the step of generating the track information comprises:
acquiring map data;
extracting map data of a working area to be operated from the map data;
acquiring first feature information from the map data of the operation area, wherein the first feature information at least comprises feature points, feature descriptors corresponding to the feature points and a first feature dictionary corresponding to the feature points;
carrying out route planning on the map data of the operation area to obtain a plurality of waypoints;
respectively acquiring three-dimensional coordinate data of the waypoints;
for each waypoint, collecting the feature descriptors and first feature dictionaries of the area that can be covered by the field of view of the onboard camera, with the pixel coordinates of the waypoint as the center, and generating a feature set;
organizing the three-dimensional coordinate data and the feature set into attribute information of the waypoint.
7. The method of claim 6, wherein the step of determining current position information of the UAV based on the track information and the real-time image data comprises:
judging whether the unmanned aerial vehicle deviates from a route corresponding to the flight path information or not based on the flight path information and the real-time image data;
if the unmanned aerial vehicle does not deviate from the air route, acquiring temporary positioning information of the unmanned aerial vehicle;
and determining the current position information of the unmanned aerial vehicle based on the temporary positioning information.
8. The method of claim 7, wherein the step of determining whether the UAV deviates from a route corresponding to the track information based on the track information and the real-time image data comprises:
extracting second feature information from the real-time image data, wherein the second feature information comprises a second feature dictionary;
calculating the matching degree of the second feature dictionary and each first feature dictionary in the feature set;
and if no first feature dictionary with a matching degree greater than the preset threshold exists, judging that the unmanned aerial vehicle deviates from the route corresponding to the flight path information.
9. The method of claim 8, wherein the step of obtaining temporary positioning information for the UAV comprises:
if the first feature dictionary with the matching degree larger than the preset threshold exists, obtaining feature points corresponding to the first feature dictionary with the matching degree larger than the preset threshold, and taking the region determined by the feature points as temporary positioning information of the unmanned aerial vehicle.
10. The method according to claim 8 or 9, wherein the feature set further comprises a first feature descriptor corresponding to each feature point, the second feature information comprises a second feature descriptor, and the step of determining the current position information of the UAV based on the temporary positioning information comprises:
determining neighborhood waypoints based on the temporary positioning information, and acquiring a first feature descriptor associated with the neighborhood waypoints;
matching the second feature descriptor with the first feature descriptor to determine a matched feature point sequence;
and determining the current position information of the unmanned aerial vehicle based on the matched feature point sequence.
11. The method of claim 10, wherein the step of determining the current position information of the UAV based on the matched sequence of feature points comprises:
and processing the matched feature point sequence with a preset positioning algorithm to determine the current position information of the unmanned aerial vehicle.
12. The method according to any one of claims 6-11, further comprising:
determining the position offset of the current position information and the track information;
and correcting the current position information based on the position offset.
13. A route generation apparatus of an unmanned aerial vehicle, the unmanned aerial vehicle including an onboard camera, the apparatus comprising:
the map data acquisition module is used for acquiring map data;
the operation area map data acquisition module is used for extracting operation area map data to be operated from the map data;
the first feature information acquisition module is used for acquiring first feature information from the map data of the operation area; the first feature information at least comprises feature points, feature descriptors corresponding to the feature points and a feature dictionary corresponding to the feature points;
the route planning module is used for performing route planning on the map data of the operation area based on the first feature information to obtain route information, and the route information comprises waypoints and attribute information of the waypoints;
wherein the route planning module comprises:
the planning submodule is used for carrying out route planning on the map data of the operation area to obtain a plurality of waypoints;
the three-dimensional coordinate data acquisition submodule is used for respectively acquiring the three-dimensional coordinate data of the waypoints;
the feature set acquisition submodule is used for, for each waypoint, acquiring the feature descriptors and feature dictionaries of the area that can be covered by the field of view of the onboard camera, with the pixel coordinates of the waypoint as the center, to generate a feature set;
and the organizing submodule is used for organizing the three-dimensional coordinate data and the feature set into the attribute information of the waypoint.
14. The apparatus of claim 13, wherein the work area map data acquisition module comprises:
a measurement point marking sub-module for acquiring a plurality of measurement points marked in the map data;
and the image cutting sub-module is used for carrying out image cutting on the closed area surrounded by the plurality of measurement points to obtain the map data of the operation area.
15. The apparatus according to claim 13 or 14, wherein the first feature information acquiring module comprises:
the map segmentation submodule is used for segmenting the map data of the operation area into a plurality of block data;
the feature point extraction submodule is used for extracting a preset number of feature points from each block data respectively;
the feature descriptor generation submodule is used for generating a feature descriptor corresponding to the feature point;
and the feature dictionary generating submodule is used for generating a feature dictionary corresponding to the feature points.
16. The apparatus according to claim 13, wherein in the track information, the distance between two waypoints is determined by the degree of overlap of two images continuously acquired by the onboard camera.
17. The apparatus of claim 13 or 16, wherein the three-dimensional coordinate data acquisition sub-module comprises:
the plane coordinate acquisition unit is used for acquiring the geographic position coordinates of the pixels corresponding to the waypoints as plane coordinates;
the height coordinate acquisition unit is used for acquiring the height of the pixel corresponding to the waypoint, acquiring the height of the unmanned aerial vehicle above the ground during planning, and taking the sum of the height and the height above the ground as the height coordinate.
18. An unmanned aerial vehicle, the unmanned aerial vehicle comprising an onboard camera, the unmanned aerial vehicle further comprising:
the flight path information acquisition module is used for acquiring flight path information, and the flight path information comprises attribute information of a plurality of waypoints;
the real-time image acquisition module is used for acquiring real-time image data acquired by the airborne camera according to preset intervals;
the positioning module is used for determining the current position information of the unmanned aerial vehicle based on the flight path information and the real-time image data;
the map data acquisition module is used for acquiring map data;
the operation area map data acquisition module is used for extracting operation area map data to be operated from the map data;
the first feature information acquisition module is used for acquiring first feature information from the map data of the operation area; the first feature information at least comprises feature points, feature descriptors corresponding to the feature points and a first feature dictionary corresponding to the feature points;
the route planning module comprises a planning submodule, a three-dimensional coordinate data acquisition submodule, a feature set acquisition submodule and an organizing submodule, wherein,
the planning submodule is used for carrying out route planning on the map data of the operation area to obtain a plurality of waypoints;
the three-dimensional coordinate data acquisition submodule is used for respectively acquiring the three-dimensional coordinate data of the waypoints;
the feature set acquisition submodule is used for, for each waypoint, acquiring the feature descriptors and first feature dictionaries of the area that can be covered by the field of view of the onboard camera, with the pixel coordinates of the waypoint as the center, to generate a feature set;
and the organizing submodule is used for organizing the three-dimensional coordinate data and the feature set into the attribute information of the waypoint.
19. The UAV of claim 18 wherein the positioning module comprises:
the flight path deviation judging submodule is used for judging whether the unmanned aerial vehicle deviates from a flight path corresponding to the flight path information or not based on the flight path information and the real-time image data;
the temporary positioning information acquisition submodule is used for acquiring temporary positioning information of the unmanned aerial vehicle if the unmanned aerial vehicle does not deviate from the air route;
and the current position information acquisition submodule is used for determining the current position information of the unmanned aerial vehicle based on the temporary positioning information.
20. The UAV of claim 19, wherein the attribute information comprises a feature set associated with the waypoint, the feature set comprising a plurality of feature points and a first feature dictionary corresponding to each feature point;
the lane departure judgment submodule includes:
a second feature information extraction unit configured to extract second feature information from the real-time image data, the second feature information including a second feature dictionary;
the matching degree calculation unit is used for calculating the matching degree of the second feature dictionary and each first feature dictionary in the feature set;
and the deviation unit is used for judging that the unmanned aerial vehicle deviates from the route corresponding to the flight path information if no first feature dictionary with a matching degree greater than the preset threshold exists.
21. The UAV of claim 20 wherein the temporary positioning information acquisition sub-module is further configured to:
if the first feature dictionary with the matching degree larger than the preset threshold exists, obtaining feature points corresponding to the first feature dictionary with the matching degree larger than the preset threshold, and taking the region determined by the feature points as temporary positioning information of the unmanned aerial vehicle.
22. The unmanned aerial vehicle of claim 20 or 21, wherein the feature set further comprises a first feature descriptor corresponding to each feature point, the second feature information comprises a second feature descriptor, and the current position information acquisition sub-module comprises:
the neighborhood waypoint determining unit is used for determining neighborhood waypoints based on the temporary positioning information and acquiring a first feature descriptor associated with the neighborhood waypoints;
the matching unit is used for matching the second feature descriptor with the first feature descriptor and determining a matched feature point sequence;
and the position determining unit is used for determining the current position information of the unmanned aerial vehicle based on the matched feature point sequence.
23. The UAV of claim 22 wherein the position determination unit is further configured to:
and processing the matched feature point sequence with a preset positioning algorithm to determine the current position information of the unmanned aerial vehicle.
24. The unmanned aerial vehicle of any of claims 18-23, further comprising:
the position offset determining module is used for determining the position offset of the current position information and the track information;
and the offset correction module is used for correcting the current position information based on the position offset.
25. An aircraft comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method of any one of claims 1 to 5 and/or 6 to 12 when executing the program.
26. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5 and/or 6 to 12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710641017.9A CN109324337B (en) | 2017-07-31 | 2017-07-31 | Unmanned aerial vehicle route generation and positioning method and device and unmanned aerial vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710641017.9A CN109324337B (en) | 2017-07-31 | 2017-07-31 | Unmanned aerial vehicle route generation and positioning method and device and unmanned aerial vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109324337A CN109324337A (en) | 2019-02-12 |
CN109324337B true CN109324337B (en) | 2022-01-14 |
Family
ID=65245026
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710641017.9A Active CN109324337B (en) | 2017-07-31 | 2017-07-31 | Unmanned aerial vehicle route generation and positioning method and device and unmanned aerial vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109324337B (en) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111982096B (en) * | 2019-05-23 | 2022-09-13 | 广州极飞科技股份有限公司 | Operation path generation method and device and unmanned aerial vehicle |
WO2020237471A1 (en) * | 2019-05-27 | 2020-12-03 | 深圳市大疆创新科技有限公司 | Flight route generation method, terminal and unmanned aerial vehicle |
CN110244765B (en) * | 2019-06-27 | 2023-02-28 | 深圳市道通智能航空技术股份有限公司 | Aircraft route track generation method and device, unmanned aerial vehicle and storage medium |
CN110262556A (en) * | 2019-07-12 | 2019-09-20 | 黑梭智慧技术(北京)有限公司 | Express Logistics unmanned vehicle route design method and apparatus |
CN110362102B (en) * | 2019-07-25 | 2022-07-29 | 深圳市道通智能航空技术股份有限公司 | Method, device and system for generating unmanned aerial vehicle route |
CN110428451B (en) * | 2019-08-15 | 2021-09-24 | 中国地质大学(北京) | Operation method for matching topographic map with GPS equipment by utilizing GPS track |
WO2021035613A1 (en) * | 2019-08-29 | 2021-03-04 | 深圳市大疆创新科技有限公司 | Path planning method and path planning device for spraying operation |
CN110515393B (en) * | 2019-10-24 | 2020-03-31 | 南京国器智能装备有限公司 | Method, device and system for real-time anti-drift correction of agricultural and forestry spraying of unmanned aerial vehicle |
CN112313476A (en) * | 2019-11-05 | 2021-02-02 | 深圳市大疆创新科技有限公司 | Air route planning method and device for unmanned aerial vehicle |
CN110716586A (en) * | 2019-11-14 | 2020-01-21 | 广州极飞科技有限公司 | Photographing control method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium |
WO2021127941A1 (en) * | 2019-12-23 | 2021-07-01 | 深圳市大疆创新科技有限公司 | Route planning method, unmanned aerial vehicle, control terminal, and computer-readable storage medium |
CN113448340B (en) * | 2020-03-27 | 2022-12-16 | 北京三快在线科技有限公司 | Unmanned aerial vehicle path planning method and device, unmanned aerial vehicle and storage medium |
CN113741507A (en) * | 2020-05-29 | 2021-12-03 | 广州极飞科技股份有限公司 | Global path trajectory planning method and device for unmanned aerial vehicle, unmanned aerial vehicle and equipment |
CN113741413B (en) * | 2020-05-29 | 2022-11-08 | 广州极飞科技股份有限公司 | Operation method of unmanned equipment, unmanned equipment and storage medium |
WO2021253269A1 (en) * | 2020-06-17 | 2021-12-23 | 深圳市大疆创新科技有限公司 | Information processing method and apparatus, program, storage medium, and computing processing device |
CN112509381B (en) * | 2020-10-16 | 2022-03-11 | 广州飞图信息科技有限公司 | Visual display method and device for unmanned aerial vehicle route signal blind area |
CN112362065B (en) * | 2020-11-19 | 2022-08-16 | 广州极飞科技股份有限公司 | Obstacle detouring track planning method and device, storage medium, control unit and equipment |
CN112822741B (en) * | 2020-12-30 | 2023-02-10 | 广州极飞科技股份有限公司 | Communication mode switching method and device, electronic equipment and storage medium |
CN113917939B (en) * | 2021-10-09 | 2022-09-06 | 广东汇天航空航天科技有限公司 | Positioning and navigation method and system of aircraft and computing equipment |
CN114320862A (en) * | 2021-11-23 | 2022-04-12 | 国网浙江省电力有限公司嘉兴供电公司 | Energy-saving optimization method for air compressor |
CN117111641B (en) * | 2023-10-25 | 2024-01-19 | 天津云圣智能科技有限责任公司 | Unmanned aerial vehicle route deviation rectifying method, device, equipment and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102582826A (en) * | 2011-01-06 | 2012-07-18 | 佛山市安尔康姆航拍科技有限公司 | Driving method and system of four-rotor-wing unmanned flight vehicle |
CN107690840B (en) * | 2009-06-24 | 2013-07-31 | 中国科学院自动化研究所 | Unmanned plane vision auxiliary navigation method and system |
CN103411609A (en) * | 2013-07-18 | 2013-11-27 | 北京航天自动控制研究所 | Online composition based aircraft return route programming method |
CN104615146A (en) * | 2015-02-05 | 2015-05-13 | 广州快飞计算机科技有限公司 | Unmanned aerial vehicle spraying operation automatic navigation method without need of external navigation signal |
CN104679011A (en) * | 2015-01-30 | 2015-06-03 | 南京航空航天大学 | Image matching navigation method based on stable branch characteristic point |
CN106502265A (en) * | 2016-10-26 | 2017-03-15 | 广州极飞科技有限公司 | A kind of airline generation method and apparatus of unmanned vehicle |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105222788B (en) * | 2015-09-30 | 2018-07-06 | 清华大学 | The automatic correcting method of the matched aircraft Route Offset error of feature based |
CN105843223B (en) * | 2016-03-23 | 2018-11-20 | 东南大学 | A kind of mobile robot three-dimensional based on space bag of words builds figure and barrier-avoiding method |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107690840B (en) * | 2009-06-24 | 2013-07-31 | 中国科学院自动化研究所 | Unmanned plane vision auxiliary navigation method and system |
CN102582826A (en) * | 2011-01-06 | 2012-07-18 | 佛山市安尔康姆航拍科技有限公司 | Driving method and system of four-rotor-wing unmanned flight vehicle |
CN103411609A (en) * | 2013-07-18 | 2013-11-27 | 北京航天自动控制研究所 | Online composition based aircraft return route programming method |
CN104679011A (en) * | 2015-01-30 | 2015-06-03 | 南京航空航天大学 | Image matching navigation method based on stable branch characteristic point |
CN104615146A (en) * | 2015-02-05 | 2015-05-13 | 广州快飞计算机科技有限公司 | Unmanned aerial vehicle spraying operation automatic navigation method without need of external navigation signal |
CN106502265A (en) * | 2016-10-26 | 2017-03-15 | 广州极飞科技有限公司 | A kind of airline generation method and apparatus of unmanned vehicle |
Also Published As
Publication number | Publication date |
---|---|
CN109324337A (en) | 2019-02-12 |
Similar Documents
Publication | Title
---|---
CN109324337B (en) | Unmanned aerial vehicle route generation and positioning method and device and unmanned aerial vehicle
KR102273559B1 (en) | Method, apparatus, and computer readable storage medium for updating electronic map | |
CN106774431B (en) | Method and device for planning air route of surveying and mapping unmanned aerial vehicle | |
CN108694882B (en) | Method, device and equipment for labeling map | |
CN103411609B (en) | A kind of aircraft return route planing method based on online composition | |
US10970924B2 (en) | Reconstruction of a scene from a moving camera | |
JP2022520019A (en) | Image processing methods, equipment, mobile platforms, programs | |
CN110617821B (en) | Positioning method, positioning device and storage medium | |
EP3106832B1 (en) | Cross spectral feature correlation for navigational adjustment | |
KR102664900B1 (en) | Apparatus for measuring ground control point using unmanned aerial vehicle and method thereof | |
US9857178B2 (en) | Method for position and location detection by means of virtual reference images | |
US9721158B2 (en) | 3D terrain mapping system and method | |
CN111829532B (en) | Aircraft repositioning system and method | |
EP2503510A1 (en) | Wide baseline feature matching using collobrative navigation and digital terrain elevation data constraints | |
KR101444685B1 (en) | Method and Apparatus for Determining Position and Attitude of Vehicle by Image based Multi-sensor Data | |
KR20200032776A (en) | System for information fusion among multiple sensor platforms | |
CN110243364A (en) | Unmanned plane course determines method, apparatus, unmanned plane and storage medium | |
CN111104861B (en) | Method and apparatus for determining wire position and storage medium | |
Schleiss et al. | VPAIR--Aerial Visual Place Recognition and Localization in Large-scale Outdoor Environments | |
JP6828448B2 (en) | Information processing equipment, information processing systems, information processing methods, and information processing programs | |
US11461944B2 (en) | Region clipping method and recording medium storing region clipping program | |
CN113312435A (en) | High-precision map updating method and device | |
WO2020118623A1 (en) | Method and system for generating an environment model for positioning | |
CN109901589B (en) | Mobile robot control method and device | |
Javanmardi et al. | 3D building map reconstruction in dense urban areas by integrating airborne laser point cloud with 2D boundary map |
Legal Events
Code | Title | Description
---|---|---
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
CB02 | Change of applicant information | Address after: 510000 Block C, 115 Gaopu Road, Tianhe District, Guangzhou City, Guangdong Province. Applicant after: XAG Co., Ltd. Address before: No.3a01, No.1 Sicheng Road, Gaotang Software Park, Tianhe District, Guangzhou, Guangdong 510000. Applicant before: Guangzhou Xaircraft Technology Co.,Ltd.
GR01 | Patent grant |