CN114490815A - Unmanned aerial vehicle task load reconnaissance coverage rate calculation method based on terrain visibility


Publication number
CN114490815A
Authority
CN
China
Prior art keywords
point
aerial vehicle
unmanned aerial
task
task load
Prior art date
Legal status
Granted
Application number
CN202210064053.4A
Other languages
Chinese (zh)
Other versions
CN114490815B (en)
Inventor
郭庆 (Guo Qing)
许洁 (Xu Jie)
谢文俊 (Xie Wenjun)
张鹏 (Zhang Peng)
晁爱农 (Chao Ainong)
樊涛 (Fan Tao)
Current Assignee
Air Force Engineering University of PLA
Original Assignee
Air Force Engineering University of PLA
Priority date
Filing date
Publication date
Application filed by Air Force Engineering University of PLA
Priority to CN202210064053.4A
Publication of CN114490815A
Application granted
Publication of CN114490815B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2458Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2462Approximate or statistical queries
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q50/40
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

An unmanned aerial vehicle (UAV) task load reconnaissance coverage rate calculation method based on terrain visibility comprises the following steps: data preparation; scan boundary calculation; scan area processing; line-of-sight calculation for the small grid areas; merge calculation; and reconnaissance coverage rate calculation. The invention supports route planning over non-planar terrain and gives a quantitative description of the reconnaissance coverage rate of an established route, avoiding the errors of manual, subjective judgment by an operator.

Description

Unmanned aerial vehicle task load reconnaissance coverage rate calculation method based on terrain visibility
Technical Field
The invention relates to the technical field of unmanned aerial vehicles (UAVs), in particular to a UAV task load reconnaissance coverage rate calculation method based on line-of-sight analysis over elevation terrain.
Background
UAV mission planning comprises route planning, task load planning, link planning, simulation deduction and the like, and the planning result serves as the main basis for the UAV system to execute its task. In general, the mission planning process is: plan the UAV route from the input task area and the UAV take-off and landing airfield; further plan the task load and links on that basis; then verify the planning result by simulation deduction so as to optimize it further. Route planning and task load planning depend on each other: on the one hand, task load planning must set the working modes and parameters of the task load on each route segment by combining the planned task area route information with the tasks to be completed; on the other hand, the result of task load planning serves as input for task route modification and influences the efficiency of task execution. The criterion for how completely the UAV has executed its task is mainly the coverage rate calculated for the designated reconnaissance area. In the traditional task load reconnaissance coverage rate calculation, the task area and the area to be reconnoitred are treated as equivalent to a two-dimensional plane, and the ratio of the reconnoitred area to the whole area to be reconnoitred is then calculated. On the one hand this is inaccurate; on the other hand, for areas with terrain occlusion, such as mountains and hills, the reconnaissance result deviates considerably from reality, which challenges the UAV's execution of tasks over such terrain.
At present, a typical method for calculating the mission load reconnaissance coverage rate of an unmanned aerial vehicle in a mission area stage mainly comprises three aspects:
1. calculating the turning radius of the unmanned aerial vehicle based on the flight performance of the unmanned aerial vehicle, determining the minimum distance of searching air routes in a task area, and simultaneously determining an entry point and an exit point of the task area and the type of the air routes;
2. determining a single scanning coverage range based on the height of the unmanned aerial vehicle, the task load attitude and the work detection range of the task load;
3. and calculating the ratio of the area of the scanning coverage area to the area of the task area so as to obtain the scouting coverage rate.
The above approach mainly performs a comprehensive analysis combining the UAV's flight performance, the task load detection performance, the threat areas and the specific state of the task area. Although it can plan route types such as spiral, lawnmower (snow-sweeping), figure-eight and raster according to the aircraft's task characteristics, it neglects the occlusion of ground targets from the task load by different terrains, so that under some conditions targets to be detected may be missed in the calculation. For plain terrain the method basically meets the usage requirements, but for most non-plain terrain it leaves detection blind areas in the reconnaissance region and cannot meet even basic requirements, so other means must be developed. A task load reconnaissance coverage calculation based on line-of-sight analysis over terrain elevation is applicable to most terrains and provides the basic conditions for accurate calculation of the reconnaissance coverage rate.
Based on this idea, the task load reconnaissance coverage rate calculation rests on three aspects:
1. UAV performance calculation: obtain the maximum length of flight paths in the task area from the UAV's performance, fuel, flight speed and the like, and determine the minimum spacing of flight paths in the task area from the UAV's turning radius, thereby determining the specific flight path in the task area;
2. Task load performance calculation: from the UAV altitude, the task load attitude and the task load reconnaissance range, and taking the earth's curvature into account, calculate the swath width covered by the task load in a single scan;
3. Coverage calculation based on terrain elevation: using line-of-sight analysis of the terrain elevation data, calculate the visible and invisible areas within each scan range, calculate the reconnaissance range of the complete route, and then calculate the reconnaissance coverage rate.
Among these technical means, UAV performance calculation is the foundation; the choice of route type in the task area determines the basic direction of the route; task load performance calculation determines, on the one hand, the theoretical coverage of a single scan and, on the other hand, feeds back into reasonable adjustment of the route; and the reconnaissance coverage rate calculation is the intuitive criterion for judging whether the planning result meets the requirements or is optimal.
Under the current state of the art, the calculation of the scan coverage rate of the task area mainly considers the UAV flight height, the task load attitude, and the task load detection distance and angle range. The specific calculation treats the ground as equivalent to a plane, and to a certain extent this can calculate the reconnaissance coverage rate in the mission area. However, this technique mainly has the following problems:
1. the situation of terrain occlusion is not considered, and a large number of detection blind areas exist in non-plain terrain task areas such as mountainous areas and hills;
2. The ground is treated as equivalent to a plane and the ground threat zone is modeled as planar; the influence of the earth's curvature is not considered, and the estimate of the UAV's flyable region is too conservative, so that in many cases no planning result meeting the task requirements can be obtained because the flyable region appears unavailable.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a method for calculating the task load reconnaissance coverage rate of a UAV based on terrain visibility, which comprises the following steps:
By introducing terrain elevation data and performing line-of-sight analysis, the task load reconnaissance coverage area and coverage rate under a given task area flight path are calculated accurately. The technique is applicable under the following conditions:
1) When executing the reconnaissance task, the UAV is in level flight, i.e. pitch and roll are both 0°;
2) The landform of the task area may be mountainous or plain, but elevation data for the area must exist;
3) Ground buildings, vegetation and other surface cover are not considered;
4) Occlusion of optical task loads such as the electro-optical (EO) pod by cloud, rain or snow under different weather conditions is not considered;
5) Interference with task loads such as radar by a complex electromagnetic environment is not considered;
(1) Data preparation
In this stage, the task area data, task load parameters and state data, and waypoint data formed from the top-level task objective are obtained; the data are calibrated in two dimensions to prepare for the next calculation, specifically including obtaining the UAV waypoints, task areas, task load state data and task load performance parameters. In the mission planning stage, the known data are as follows:
a) Longitude, latitude and altitude (B1, L1, H) of the UAV position P in the WGS84 coordinate system;
b) Pitch angle θ of the task load boresight center, regarded as a parameter set by the operator; the value range is 0-90°, with the task load horizontal at 0° and vertical at 90°;
c) The UAV heading angle ψ, obtained in real time during calculation;
d) The UAV drift angle γ, obtained in real time during calculation;
e) Task load field-of-view horizontal size Vh and vertical size Vv; both values are parameters of the task load and are easily computed from the working mode selected by the operator and the set parameters. Near-boundary pitch angle of the task load in its current state:
θ1 = θ + Vv/2
f) Far-boundary pitch angle of the task load in its current state:
θ2 = θ − Vv/2
g) Task load limit pitch angle θmax; this value is an inherent attribute of the task load, confirmed by consulting the task load manual; the value range is 0-90°;
h) Maximum acting distance Lmax of the task load; this value is an inherent attribute of the task load, confirmed by consulting the task load manual;
i) The earth is approximated as a sphere of radius R = 6371000 m;
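For illustration, the known data a)-i) can be gathered into a single structure. A minimal Python sketch; all names are illustrative, not taken from the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class ScanInputs:
    B1: float          # UAV latitude, degrees (WGS84)
    L1: float          # UAV longitude, degrees (WGS84)
    H: float           # flight height above the spherical earth, metres
    theta: float       # boresight-center pitch angle, degrees (0 horizontal, 90 vertical)
    psi: float         # heading angle, degrees
    gamma: float       # drift angle, degrees
    Vh: float          # field-of-view horizontal size, degrees
    Vv: float          # field-of-view vertical size, degrees
    theta_max: float   # limit pitch angle of the task load, degrees
    L_max: float       # maximum acting distance of the task load, metres

R = 6371000.0          # earth radius, metres (sphere approximation)
```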
(2) Scan boundary calculation
In this stage, scan boundary points are calculated from the waypoint data, the task load parameters and the state data. The calculation considers the influence of the earth's curvature but not that of the terrain elevation. The reconnaissance coverage rate calculation is applied in the task pre-planning stage; ideally the task load is used during the UAV's level-flight stage, in which the pitch and roll of the UAV attitude are both 0°, and the preset task load mode is left side view or right side view. The basic assumption is that the footprint of the task load acting on the ground is a sector, which gives the reconnaissance coverage of one sampling. Let
the point P be the current position of the UAV;
P0 its projection on the ground;
point A the intersection of the left boundary of the task load field of view with the near boundary of the task load detection range, projected on the ground;
point B the intersection of the left boundary of the task load field of view with the far boundary of the task load detection range, projected on the ground;
point C the intersection of the right boundary of the task load field of view with the far boundary of the task load detection range, projected on the ground;
point D the intersection of the right boundary of the task load field of view with the near boundary of the task load detection range, projected on the ground;
arc AD the near boundary of the scan range at the current position of the UAV task load;
arc BC the far boundary of the scan range at the current position of the UAV task load;
point M the intersection of the projection of the task load boresight center line with the near boundary of the task load detection range on the ground;
point N the intersection of the projection of the task load boresight center line with the far boundary of the task load detection range on the ground;
line MN the task load field-of-view center line, perpendicular to the UAV flight direction (i.e., at a 90° angle to the flight direction);
The longitude and latitude of points A, B, C, D, M, N are obtained from set parameters such as the UAV altitude, the maximum acting slant range of the task load and the limit pitch angle of the task load. The specific calculation is as follows:
① Calculate the latitude and longitude of point M
a) Central angle calculation
For accuracy, the earth is approximated as a sphere and its curvature is considered. Known are the UAV position P, the geocenter O, the projection point P0 of P onto the sphere, the near-boundary pitch angle θ1 of the task load (so that ∠OPM = 90° − θ1), the earth radius R, the flight height H, the maximum acting distance Lmax of the task load, and the limit pitch angle θmax of the task load; M′ is the intersection of the ray from the UAV position at the limit pitch angle with the sphere, and N′ is the tangent point of the ray from the UAV position with the sphere. Using the law of cosines, the first central angle θ3 = ∠MOP and the second central angle θ4 = ∠NOP are calculated first. The specific steps are:
i. Calculate the maximum acting slant distance within the line-of-sight range of the task load:
PN′ = √((R+H)² − R²)    (1)
ii. Calculate the minimum pitch angle, given by ∠OPN′:
∠OPN′ = arcsin(R / (R+H))    (2)
iii. Calculate the shortest slant range PM′ reachable by the task load; substituting the parameters into the law of cosines gives formula (3):
PM′² + (R+H)² − 2·PM′·(R+H)·cos(90° − θmax) = R²    (3)
from which PM′ is obtained;
iv. Check the near-boundary pitch angle θ1
To ensure the validity of the calculation, i.e. that the calculated near-boundary pitch angle θ1 lies in a reasonable range, a check is required: if θ1 is less than the minimum pitch angle given by ∠OPN′, then θ1 is set to that minimum (corresponding to ∠OPN′); if it is greater than the physical limit pitch θmax of the task load (input from the top level or preset), then θ1 = θmax (corresponding to ∠OPM′); otherwise the value is unchanged. In this way the task load is kept within its effective, usable range;
v. Check the maximum acting distance Lmax
If the maximum acting distance Lmax of the task load is greater than the maximum slant range PN′, then Lmax = PN′; if Lmax is less than the minimum slant range PM′, then Lmax = PM′; otherwise Lmax is unchanged. In this way the effective maximum acting distance of the task load is kept within the usable range;
vi. Calculate the first central angle θ3; PM is the slant range corresponding to the (checked) near-boundary pitch angle θ1.
The PM value is calculated from the law of cosines:
PM² + (R+H)² − 2·PM·(R+H)·cos(90° − θ1) = R²    (4)
Using the law of cosines again:
R² + (R+H)² − 2·R·(R+H)·cos θ3 = PM²    (5)
θ3 is obtained:
θ3 = arccos((R² + (R+H)² − PM²) / (2·R·(R+H)))    (6)
vii. Similarly, since Lmax is a known value, the second central angle θ4 corresponding to points N and P is calculated directly with the law of cosines:
θ4 = arccos((R² + (R+H)² − Lmax²) / (2·R·(R+H)))
Compare the values of θ3 and θ4: if θ3 is greater than θ4, exchange the values of θ3 and θ4; otherwise make no change;
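Steps i-vii above can be condensed into a short sketch. A minimal illustration under the patent's sphere assumption; function and variable names are invented here, and solving formulas (3)/(4) as a quadratic in the slant range is one way to extract it:

```python
import math

R = 6371000.0  # earth radius in metres (sphere approximation used by the patent)

def slant_range_to_sphere(H, pitch_deg):
    """Nearest intersection of a ray from height H, depressed pitch_deg below
    the horizontal, with the spherical earth; formulas (3)/(4) rearranged as a
    quadratic in the slant range (cos(90 deg - pitch) = sin(pitch))."""
    s = math.sin(math.radians(pitch_deg))
    disc = (R + H) ** 2 * s ** 2 - ((R + H) ** 2 - R ** 2)
    return (R + H) * s - math.sqrt(disc)      # smaller root = near intersection

def central_angle(H, slant):
    """Formulas (5)-(7): geocentric angle between the sub-aircraft point and
    the point where a ray of the given slant range meets the sphere."""
    return math.degrees(math.acos(
        (R ** 2 + (R + H) ** 2 - slant ** 2) / (2 * R * (R + H))))

def scan_boundary_angles(H, theta1_deg, theta_max_deg, L_max):
    """Steps i-vii: returns the checked near-boundary pitch, the checked
    maximum acting distance and the sorted central angles (theta3, theta4)."""
    PN = math.sqrt((R + H) ** 2 - R ** 2)                    # formula (1)
    min_pitch = 90.0 - math.degrees(math.asin(R / (R + H)))  # from formula (2)
    theta1 = min(max(theta1_deg, min_pitch), theta_max_deg)  # step iv
    PM_prime = slant_range_to_sphere(H, theta_max_deg)       # formula (3)
    L_max = min(max(L_max, PM_prime), PN)                    # step v
    PM = slant_range_to_sphere(H, theta1)                    # formula (4)
    theta3, theta4 = sorted((central_angle(H, PM),           # formula (6)
                             central_angle(H, L_max)))       # step vii
    return theta1, L_max, theta3, theta4
```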
b) M-point latitude and longitude calculation
After the values of θ3 and θ4 are obtained, the longitude and latitude of point M are further calculated. At this point, the longitude, latitude, heading angle and drift angle (L1, B1, ψ, γ) of the UAV's orthographic projection point P0 on the sphere are known, and the central angle from P0 to M is θ3. With Q the north pole, P0, M and Q form a spherical triangle, and B1, B2, L1, L2 are the latitude and longitude coordinates of the two points P0 and M respectively. The longitude and latitude coordinates of M are calculated with spherical-triangle formulas.
For convenience of calculation, the spherical triangle needs to be simplified.
Here the sides opposite Q, P0 and M are a, b and c respectively, and A′, B′, C′ are the angles on the sphere: ∠A′ = ∠P0QM, ∠B′ = ∠QP0M, ∠C′ = ∠QMP0; a is the great-circle arc P0M, b the great-circle arc QM, and c the great-circle arc P0Q.
In the spherical triangle P0MQ, knowing the latitude B1 and longitude L1 of point P0, the heading angle ψ, the drift angle γ and the central angle ∠P0OM, calculate the latitude B2 and longitude L2 of the target point M.
i. Calculate the M-point longitude L2
Known: ∠A′ = L2 − L1, c = ∠P0OQ = 90° − B1, b = ∠MOQ = 90° − B2, a = ∠P0OM. Here a, b and c are arc lengths of the spherical triangle on a sphere of unit radius; since arc length then equals the subtended angle, they are expressed as angles.
Since P0M is perpendicular to the route direction, ∠B′ = 90° + ψ + γ.
According to the spherical law of sines:
sin a / sin A′ = sin b / sin B′ = sin c / sin C′    (7)
Substituting the known conditions gives:
sin a·sin B′ = sin(90° − B2)·sin A′    (8)
According to the spherical-triangle cotangent formula cot a·sin c = cot A′·sin B′ + cos B′·cos c, substituting the known conditions gives:
cot a·sin(90° − B1) = cot A′·sin B′ + cos B′·cos(90° − B1)    (9)
After simplification:
tan A′ = sin B′ / (cot a·cos B1 − sin B1·cos B′)    (10)
from which A′, and hence the longitude L2 = L1 + A′, is calculated;
ii. Calculate the M-point latitude B2
Before calculating the M-point latitude B2, first check B′:
If B′ is zero, the target point M and the point P0 lie on the same meridian, and B2 = B1 + ∠P0OM;
If the value of B′ is not zero, B2 must be determined by judging the quadrant of b (where b = 90° − B2). From the spherical law of cosines:
cos b = cos a·sin B1 + sin a·cos B1·cos B′    (11)
and, converted through equation (7):
sin b = sin a·sin B′ / sin A′    (12)
Judge the value of L2 − L1: if L2 − L1 is greater than zero, the value of sin b is unchanged; otherwise sin b takes a negative sign.
Judge the value of cos b: if cos b is greater than or equal to zero, b lies in the first or fourth quadrant; otherwise in the second or third. On this basis, the quadrant of b is determined by judging the value of sin b:
When cos b is greater than zero and sin b is greater than or equal to zero, b is in the first quadrant:
B2 = 90° − arcsin(sin b)    (13)
When cos b is greater than or equal to zero and sin b is less than zero, b is in the fourth quadrant:
B2 = 90° + arcsin(sin b)    (14)
When cos b is less than zero and sin b is greater than or equal to zero, b is in the second quadrant:
B2 = 90° + arcsin(sin b)    (15)
When cos b is less than zero and sin b is less than zero, b is in the third quadrant:
B2 = 90° − arcsin(sin b)    (16)
The longitude and latitude of point M are thus calculated; repeating the above calculation steps gives the longitude and latitude of N, A, B, C and D;
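As a compact equivalent of formulas (10)-(12), the standard great-circle "destination point" identities can be used; the following sketch is one such formulation, not the patent's literal quadrant-by-quadrant procedure (the azimuth would be the patent's B′ = 90° + ψ + γ for a boresight perpendicular to the track):

```python
import math

def destination_point(lat1_deg, lon1_deg, bearing_deg, central_angle_deg):
    """Given a start latitude/longitude, an azimuth from north and a central
    angle (from formula (6) or step vii), return the latitude/longitude of the
    ground point (M, N, A, B, C or D). Standard spherical-triangle identities."""
    lat1, brg, a = map(math.radians, (lat1_deg, bearing_deg, central_angle_deg))
    lat2 = math.asin(math.sin(lat1) * math.cos(a) +
                     math.cos(lat1) * math.sin(a) * math.cos(brg))
    dlon = math.atan2(math.sin(brg) * math.sin(a) * math.cos(lat1),
                      math.cos(a) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), lon1_deg + math.degrees(dlon)
```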
(3) Scan area processing
The scan area is processed with a rasterization method;
1) terrain elevation acquisition
After the longitude and latitude of each point under the influence of the earth's curvature are obtained, the altitude of each of the points A, B, C, D, M, N is determined by substituting its longitude and latitude into the elevation data;
2) scanning range rasterization
Rasterize the area formed by the four points A, B, C, D: divide it into m equal parts in the AD direction and n equal parts in the AB direction to obtain m × n sub-quadrilateral areas, and store the center points of the sub-quadrilaterals in preparation for use as line-of-sight sampling points;
The parameters used for rasterization are as follows:
① m: azimuth-direction discretization coefficient; m is an integer with minimum value 1; the smaller the value, the coarser the subdivision granularity and the larger the result deviation, but the higher the calculation efficiency; it is input and set by the planner;
② n: range-direction discretization coefficient, set in the same way; m and n are not necessarily equal;
③ m × n: number of discrete points covering one task load detection footprint;
④ d: azimuth-direction minimum subdivision distance, d = LAD / m, where LAD is the length of the line connecting the horizontal projection points of A and D;
(4) Grid small-area visibility calculation
The line-of-sight calculation for each small grid area is performed using the terrain elevation data, the current UAV position data and the rasterized data, giving the visibility result for each small area. In the line-of-sight analysis, whether two points are intervisible is calculated and judged from the current UAV position and the target point position in combination with the elevation terrain: the UAV flight profile cuts the terrain longitudinally to obtain a terrain profile, and the real-time UAV position is connected to the target point; if the connecting line intersects the terrain profile, the target is not visible from the current point, otherwise it is visible;
Whether the UAV task load can see the ground target and whether the ground target can see the UAV are judged by the same criterion. For convenience of calculation, the ground target is taken as the viewpoint (point V), the detection distance of the UAV task load, converted to a ground distance, as the action range, and the boundary position (one direction being chosen first) as the target point (point T). The intersection points are then analyzed point by point from the viewpoint towards the target point, the intersection points being F(xi, yi), G(xi+1, yi+1), .... The slope tan α of the sight line VT is calculated by:
tan α = (ZT − ZV) / DTV    (17)
where:
ZT is the height of point T(xT, yT);
ZV is the height of point V(xV, yV);
DTV = √((xT − xV)² + (yT − yV)²) is the horizontal distance from point T to point V;
A coordinate system is established with the reference horizontal plane towards the east as the X axis and the upward vertical through point V as the Y axis. The slope of the angle βi between the horizontal plane and the line from the viewpoint V to each intersection point is:
tan βi = (Zi − ZV) / DFV    (18)
where:
Zi is the height of point F(xi, yi);
ZV is the height of point V(xV, yV);
DFV = √((xi − xV)² + (yi − yV)²) is the horizontal distance from point F to point V;
the visibility is judged by comparing tan alpha and tan betaiThe value of (c) completes:
if tan betaiIf the alpha is larger than tan alpha, the visibility is not allowed, and the calculation is finished;
if tan betaiIf tan alpha is not greater, F point is looked through, and the next point G (x) is calculatedi+1,yi+1) (ii) a If the system can be pushed to the target point T all the time, the viewpoint and the target point can be seen through;
According to formulas (17) and (18) and the above intervisibility judgment, one grid small-area visibility calculation based on the waypoint data and related data constitutes one sampling calculation; after one sampling calculation is completed and its result obtained, iterative calculation proceeds over the grid parameters in the azimuth and range directions, yielding the visible and invisible grids within the coverage of one sampling;
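The intervisibility test of formulas (17)/(18) might be sketched as follows, assuming the terrain profile points between V and T have already been extracted:

```python
import math

def intervisible(V, T, profile):
    """Line-of-sight test per formulas (17)/(18): V and T are (x, y, z)
    endpoints, profile is the ordered list of terrain intersection points
    (x, y, z) between them. Returns True when no terrain sample rises above
    the sight line VT."""
    d_vt = math.hypot(T[0] - V[0], T[1] - V[1])
    tan_alpha = (T[2] - V[2]) / d_vt              # slope of the sight line VT
    for p in profile:
        d = math.hypot(p[0] - V[0], p[1] - V[1])
        if d == 0.0:
            continue
        tan_beta = (p[2] - V[2]) / d              # slope to the terrain point
        if tan_beta > tan_alpha:                  # terrain blocks the sight line
            return False
    return True
```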
(5) Merge calculation
The grid small-area visibility calculation is performed for each calculation point based on the waypoint data and related data, and the visibility results are merged to obtain a visible/invisible grid map. This step also synchronously calculates the union of the scan areas. Specifically, after one sampling calculation is completed, a visible/invisible grid map is obtained; by analogy, the scan areas of all waypoints are calculated from the waypoint data and the subdivision interval, the visible and invisible grid maps are obtained, and the merge calculation is performed. Let the scan areas of all waypoints be U1, the invisible areas U2, the intersection of the scan area with the task area U3, and the intersection of the invisible areas with the task area U4. The specific algorithm steps of the merge calculation are as follows:
① Let waypoint 1 and waypoint 2 denote any two waypoints on the UAV mission route; the calculation for the other waypoints is analogous. According to the sampling interval, slide through all the interpolated waypoints between waypoint 1 and waypoint 2 to obtain the scan area data generated by every waypoint; the scan area is a set of sectors, each sector being the scan area data generated by one waypoint between waypoint 1 and waypoint 2;
② Arrange and store the boundary points of each polygonal area in counterclockwise order;
③ Merge the sector areas to obtain the polygons formed by the scan-area boundary points; traverse all polygon areas formed by the scan areas, and merge the currently merged area with each polygon one by one;
④ From the results of the traversal and the one-by-one unions, calculate the boundary-point data of the union polygon area; a sketch of this union step follows;
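The union of step ④ can be sketched with a polygon library; shapely is used here as one possible choice, which the patent does not prescribe:

```python
from shapely.geometry import Polygon
from shapely.ops import unary_union

def merge_scan_areas(sectors):
    """Union of the per-waypoint scan footprints (step (5)). Each sector is a
    list of (lon, lat) boundary points in counterclockwise order; the result
    is the merged scan region U1."""
    return unary_union([Polygon(s) for s in sectors])
```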
(6) Reconnaissance coverage rate calculation
Recursing along the route direction, the reconnaissance coverage of the whole route is obtained from the intersection U3 of the visible area with the task area and the intersection U4 calculated in the preceding steps;
Let:
P1 be the point where the UAV is located at the first calculation, which is also where the task load is located;
A1, B1, C1, D1 be the boundary points of the scan range at the current position of the UAV task load when the UAV is at P1: arc A1D1 is the near boundary, with A1 at the left and D1 at the right; arc B1C1 is the far boundary, with B1 at the left and C1 at the right; M1 and N1 are the center points of arcs A1D1 and B1C1 respectively;
Similarly, P2 is the point where the UAV is located at the next calculation, also where the task load is located; the descriptions and definitions of the other points are consistent with the corresponding points of P1;
Let:
U1 be the scan area of all waypoints;
U2 be all invisible areas;
U3 be the intersection of the scan area with the task area;
U4 be the intersection of the invisible area with the task area;
The reconnaissance coverage rate is then obtained by calculating and processing the two-dimensional graphics.
In one embodiment of the present invention, the DYNTACS visibility algorithm is employed in step (4).
In a specific embodiment of the present invention, in step (5), the sampling interval is an integer multiple of the azimuth-direction minimum subdivision distance d.
In another specific embodiment of the present invention, in step (6), the recursive sampling interval is set to an integer multiple of the azimuth-direction minimum subdivision distance d, the multiple having a minimum value of 1 and a maximum value of m.
In another embodiment of the present invention, in step (6), the two-dimensional graphics are calculated and processed specifically through the following steps:
① Compute the set of all task areas;
② Merge the line-of-sight visible areas of all waypoint scans;
③ Take a single task area from the task area set;
④ Arrange the boundary-point data of the task area in reverse order;
⑤ Compute the intersection of the area data with the visible area to obtain the boundary points of the intersection region;
⑥ Compute the intersection area from the boundary points of the intersection region;
⑦ Divide the intersection area by the area of the task area to obtain the scan coverage rate of that task area;
⑧ Repeat steps ③ to ⑦ to calculate the coverage rate of all task areas.
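Steps ③-⑦ reduce to a polygon intersection and an area ratio; a sketch reusing the shapely-based region from the merge step, assuming planar (projected) coordinates:

```python
def coverage_rate(task_area, visible_region):
    """Ratio of the visible part of a task area to its full area. task_area
    is a shapely Polygon; visible_region is the merged visible region (e.g.
    from merge_scan_areas above)."""
    covered = task_area.intersection(visible_region)   # step 5: overlap region
    return covered.area / task_area.area               # step 7: area ratio
```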
For various complex terrains such as mountains and hills, as well as plains, the invention uses a unified reconnaissance coverage rate calculation: without changing the software and hardware architecture of the original mission planning system, it introduces an elevation terrain database and performs line-of-sight analysis to calculate the reconnaissance coverage area and the reconnaissance coverage rate. It provides a coverage rate calculation method that comes closer to the real mission objective under various landforms and surface features, and offers a new approach to route planning in the task area stage of mission planning.
The invention has the following advantages:
(1) Support for route planning over non-planar terrain. Using a line-of-sight calculation based on a terrain elevation model, the method turns the original two-dimensional task area three-dimensional and fully considers the terrain occlusion of the task load reconnaissance coverage under different terrains, extending the applicability of route planning from plain terrain to a variety of landforms and enlarging its usable range.
(2) Quantitative description of the reconnaissance coverage rate of an established route. On the basis of terrain-based line-of-sight analysis, the established route is evaluated by plotting and statistics, completing the calculation of the coverage data of the reconnaissance range. In this way the method serves as a quantitative basis for route planning optimization and avoids the errors caused by an operator's manual, subjective judgment.
Drawings
FIG. 1 shows a schematic view of a reconnaissance coverage;
FIG. 2 shows a schematic horizontal projection of a reconnaissance coverage;
FIG. 3 shows a schematic view of a scout coverage vertical projection;
fig. 4 shows a calculation schematic of the scan coverage center points M and N (taking the earth's curvature into account);
FIG. 5 shows a spherical triangular schematic;
FIG. 6 shows a simplified schematic diagram of a spherical triangle;
FIG. 7 shows a rasterization schematic;
FIG. 8 shows a perspective computational diagram;
FIG. 9 shows a perspective algorithm diagram;
FIG. 10 shows a perspective calculation result diagram;
FIG. 11 shows a scan region merge diagram;
FIG. 12 shows a merged calculations schematic;
fig. 13 shows a non-see-through region merging calculation diagram.
Detailed Description
Aiming at the increasingly refined mission planning requirements of UAVs, the method builds on full consideration of the common route planning constraints and common planning algorithms. Calculating the task area coverage rate of a pre-planned route effectively improves the effectiveness evaluation of the planned route and provides a basis for rationally optimizing the route in the task area stage of route planning. When the task area coverage rate is calculated with an equivalent planar ground model, an approximate coverage result can be obtained, but the influence of the real terrain on the coverage cannot be taken into account. The invention therefore calculates the reconnaissance coverage rate based on line-of-sight analysis over elevation terrain. The general idea is as follows: for the task area, the longitude and latitude attributes of its boundary are obtained; the terrain relief within this range is obtained from the elevation data; the two-dimensional map is turned three-dimensional; and the line-of-sight analysis in three-dimensional space is performed in combination with the UAV position and the task load working state, so that the coverage area and coverage rate are calculated with improved realism and accuracy. The calculation process is explained as follows:
By introducing terrain elevation data and performing line-of-sight analysis, the task load reconnaissance coverage area and coverage rate under a given task area flight path are calculated accurately; the technique is applicable under the following conditions (basic assumptions):
6) When executing the reconnaissance task, the UAV is in level flight, i.e. pitch and roll are both 0°;
7) The landform of the task area may be mountainous or plain, but elevation data for the area must exist;
8) Ground buildings, vegetation and other surface cover are not considered;
9) Occlusion of optical task loads such as electro-optical (EO) pods by cloud, rain or snow under different weather conditions is not considered;
10) Interference with task loads such as radar by a complex electromagnetic environment is not considered.
(1) Data preparation
In this stage, the task area data, task load parameters and state data, waypoint data and the like formed from the top-level task objective are obtained. The data are calibrated in two dimensions to prepare for the next calculation, specifically including obtaining the UAV waypoints, task areas, task load state data, task load performance parameters and the like. Methods for two-dimensional calibration of these data are well known to those skilled in the art and are not described further. In the mission planning stage, the known data are as follows:
j) Longitude, latitude and altitude (B1, L1, H) of the UAV position P in the WGS84 coordinate system;
k) Pitch angle θ of the task load boresight center, regarded as a parameter set by the operator; the value range is 0-90°, with the task load horizontal at 0° and vertical at 90°;
l) The UAV heading angle ψ, obtained in real time during calculation;
m) The UAV drift angle γ, obtained in real time during calculation;
n) Task load field-of-view horizontal size Vh and vertical size Vv; both values are parameters of the task load and are easily computed from the working mode selected by the operator and the set parameters. Near-boundary pitch angle of the task load in its current state:
θ1 = θ + Vv/2
o) Far-boundary pitch angle of the task load in its current state:
θ2 = θ − Vv/2
p) Task load limit pitch angle θmax; this value is an inherent attribute of the task load and can be confirmed by consulting the task load manual; the value range is 0-90°;
q) Maximum acting distance Lmax of the task load; this value is an inherent attribute of the task load and can be confirmed by consulting the task load manual;
r) The earth is approximated as a sphere of radius R = 6371000 m.
(2) Scan boundary calculation
In this stage, scan boundary points are calculated from the waypoint data, the task load parameters and the state data. The calculation considers the effect of the earth's curvature but not, at first, the effect of the terrain elevation. The reconnaissance coverage rate calculation is applied in the task pre-planning stage; ideally the task load is used during the UAV's level-flight stage (pitch and roll in the UAV attitude are both 0°), and the preset task load mode is left side view or right side view. Taking as the basic assumption that the footprint of the task load acting on the ground is a sector, the reconnaissance coverage obtainable in one sampling is shown schematically in figure 1; its horizontal projection is drawn in figure 2 and its vertical view in figure 3. Here,
the point P is the current position of the unmanned aerial vehicle;
P0 is its projection on the ground;
the point A is the intersection point of the projection of the left boundary of the task load view field and the near boundary of the task load detection range on the ground;
the point B is an intersection point of the left boundary of the task load view field and the far boundary of the task load detection range projected on the ground;
the point C is an intersection point of the right boundary of the task load view field and the far boundary of the task load detection range projected on the ground;
and the point D is the intersection point of the right boundary of the task load view field and the near boundary of the task load detection range projected on the ground.
The AD arc line is a near boundary of a scanning range of the current position of the unmanned aerial vehicle task load;
the BC arc line is a far boundary of a scanning range of the current position of the unmanned aerial vehicle task load;
the M point is an intersection point of the projection of the central line of the visual axis of the task load and the near boundary of the detection range of the task load on the ground;
the N point is an intersection point of the central line of the visual axis of the task load and the projection of the remote boundary of the detection range of the task load on the ground;
the MN connecting line is a task load view field central line and is perpendicular to the flight direction of the unmanned aerial vehicle (the included angle between the MN connecting line and the flight direction is 90 degrees);
The longitude and latitude of points A, B, C, D, M, N can be obtained from set parameters such as the UAV altitude, the maximum acting slant range of the task load and the limit pitch angle of the task load. The specific calculation is as follows:
calculating the latitude and longitude of M point
b) Central angle calculation
For more accurate calculation, the earth is approximated to be a sphere, and the curvature of the earth is considered, as shown in fig. 4, the projection point P of the unmanned aerial vehicle position P and the geocentric O, P point to the sphere is knownONear-boundary pitch angle theta of task load1("OPM"), earth radius R, flight height H, and maximum working distance L of mission loadmaxUltimate pitch angle theta of mission loadmaxAnd the intersection point M 'of the ray emitted by the unmanned aerial vehicle position and the spherical surface when the angle of maximum pitch is achieved, and the tangent point N' of the ray emitted by the unmanned aerial vehicle position and the spherical surface. According to the cosine theorem, the first central angle theta can be preferentially calculated3(. MOP) and second central angle
Figure BSA0000264391230000183
(NOP). The method comprises the following specific steps:
i. Calculate the maximum acting slant distance within the line-of-sight range of the task load:
PN′ = √((R+H)² − R²)    (1)
ii. Calculate the minimum pitch angle, given by ∠OPN′:
∠OPN′ = arcsin(R / (R+H))    (2)
iii. Calculate the shortest slant range PM′ reachable by the task load; substituting the parameters into the law of cosines gives formula (3):
PM′² + (R+H)² − 2·PM′·(R+H)·cos(90° − θmax) = R²    (3)
from which PM′ is obtained. The parameters in the formula have been described above and are not repeated.
iv. Check the near-boundary pitch angle θ1
To ensure the validity of the calculation, i.e. that the calculated near-boundary pitch angle θ1 lies in a reasonable range, a check is required. If θ1 is less than the minimum pitch angle given by ∠OPN′, then θ1 is set to that minimum (corresponding to ∠OPN′); if it is greater than the physical limit pitch θmax of the task load (top-level input or preset), then θ1 = θmax (corresponding to ∠OPM′); otherwise the value is unchanged. In this way the task load is kept within its effective, usable range.
v. Check the maximum acting distance Lmax
If the maximum acting distance Lmax of the task load is greater than the maximum slant range PN′, then Lmax = PN′; if Lmax is less than the minimum slant range PM′, then Lmax = PM′; otherwise Lmax is unchanged. In this way the maximum acting distance of the task load is kept effective and within the usable range.
vi. Calculate the first central angle θ3; PM is the slant range corresponding to the (checked) near-boundary pitch angle θ1.
The PM value can be calculated from the law of cosines:
PM² + (R+H)² − 2·PM·(R+H)·cos(90° − θ1) = R²    (4)
Using the law of cosines again:
R² + (R+H)² − 2·R·(R+H)·cos θ3 = PM²    (5)
θ3 can be calculated:
θ3 = arccos((R² + (R+H)² − PM²) / (2·R·(R+H)))    (6)
vii. Similarly, since Lmax is a known value, the second central angle θ4 corresponding to points N and P can be calculated directly with the law of cosines:
θ4 = arccos((R² + (R+H)² − Lmax²) / (2·R·(R+H)))
Compare the values of θ3 and θ4: if θ3 is greater than θ4, exchange the values of θ3 and θ4; otherwise no change is made (the purpose of this step is to unify the coordinate system into due-north mode for ease of calculation).
b) M-point latitude and longitude calculation
After the values of θ3 and θ4 are calculated, the longitude and latitude of point M are further calculated; the calculation schematic is shown in fig. 5. At this point, the longitude, latitude, heading angle and drift angle (L1, B1, ψ, γ) of the UAV's orthographic projection point P0 on the sphere are known, and the central angle from P0 to M is the θ3 determined above. With Q the north pole, as shown in fig. 5, P0, M and Q form a spherical triangle, and B1, B2, L1, L2 are the latitude and longitude coordinates of the two points P0 and M respectively. The longitude and latitude coordinates of M can be calculated with spherical-triangle formulas.
For convenience of calculation, simplification is required; the simplified spherical triangle is shown in fig. 6.
Here the sides opposite Q, P0 and M are a, b and c respectively, and A′, B′, C′ are the angles on the sphere: ∠A′ = ∠P0QM, ∠B′ = ∠QP0M, ∠C′ = ∠QMP0; a is the great-circle arc P0M, b the great-circle arc QM, and c the great-circle arc P0Q.
As shown in fig. 6, in the spherical triangle P0MQ, knowing the latitude B1 and longitude L1 of point P0, the heading angle ψ, the drift angle γ and the central angle ∠P0OM, calculate the latitude B2 and longitude L2 of the target point M.
i. Calculate the M-point longitude L2
Known: ∠A′ = L2 − L1, c = ∠P0OQ = 90° − B1, b = ∠MOQ = 90° − B2, a = ∠P0OM. Here a, b and c are arc lengths of the spherical triangle on a sphere of unit radius; since arc length then equals the subtended angle, they can be expressed as angles.
Since P0M is perpendicular to the route direction, ∠B′ = 90° + ψ + γ.
According to the spherical law of sines:
sin a / sin A′ = sin b / sin B′ = sin c / sin C′    (7)
Substituting the known conditions gives:
sin a·sin B′ = sin(90° − B2)·sin A′    (8)
According to the spherical-triangle cotangent formula cot a·sin c = cot A′·sin B′ + cos B′·cos c, substituting the known conditions gives:
cot a·sin(90° − B1) = cot A′·sin B′ + cos B′·cos(90° − B1)    (9)
After simplification:
tan A′ = sin B′ / (cot a·cos B1 − sin B1·cos B′)    (10)
from which A′, and hence the longitude L2 = L1 + A′, can be calculated.
ii. Calculate the M-point latitude B2
Before calculating the M-point latitude B2, first check B′:
If B′ is zero, the target point M and the point P0 lie on the same meridian, and B2 = B1 + ∠P0OM;
If the value of B′ is not zero, B2 must be determined by judging the quadrant of b (where b = 90° − B2). From the spherical law of cosines:
cos b = cos a·sin B1 + sin a·cos B1·cos B′    (11)
and, converted through equation (7):
sin b = sin a·sin B′ / sin A′    (12)
Judge the value of L2 − L1: if L2 − L1 is greater than zero, the value of sin b is unchanged; otherwise sin b takes a negative sign.
Judge the value of cos b: if cos b is greater than or equal to zero, b lies in the first or fourth quadrant; otherwise in the second or third. On this basis, the quadrant of b is determined by judging the value of sin b (the cosine alone narrows b to two quadrants; the unique quadrant is then determined from the sine).
When cos b is greater than zero and sin b is greater than or equal to zero, b is in the first quadrant:
B2 = 90° − arcsin(sin b)    (13)
When cos b is greater than or equal to zero and sin b is less than zero, b is in the fourth quadrant:
B2 = 90° + arcsin(sin b)    (14)
When cos b is less than zero and sin b is greater than or equal to zero, b is in the second quadrant:
B2 = 90° + arcsin(sin b)    (15)
When cos b is less than zero and sin b is less than zero, b is in the third quadrant:
B2 = 90° − arcsin(sin b)    (16)
Thus, the longitude and latitude of the M point are calculated. By repeating the above calculation steps, the longitude and latitude of N, A, B, C, D can be calculated.
(3) Scan area processing. The scan area is processed using a rasterization method, which is well known to those skilled in the art and is not described again.
1) Terrain elevation acquisition
After the longitude and latitude of each point under the influence of the earth's curvature are obtained, the altitude of each of the points A, B, C, D, M, N can be determined by substituting its longitude and latitude into the elevation data.
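One way to "substitute the longitude and latitude into the elevation data" is bilinear interpolation over a regular DEM grid; a sketch with an assumed, illustrative DEM layout (the patent does not fix a DEM format):

```python
def elevation_at(dem, lat, lon):
    """Bilinear interpolation of a regular-grid DEM. dem is assumed to expose
    an origin (lat0, lon0), cell sizes (dlat, dlon) and a 2-D height array
    z[row][col] -- all illustrative names, not from the patent."""
    r = (lat - dem.lat0) / dem.dlat
    c = (lon - dem.lon0) / dem.dlon
    r0, c0 = int(r), int(c)
    fr, fc = r - r0, c - c0
    z = dem.z
    top = z[r0][c0] * (1 - fc) + z[r0][c0 + 1] * fc        # along-row interpolation
    bot = z[r0 + 1][c0] * (1 - fc) + z[r0 + 1][c0 + 1] * fc
    return top * (1 - fr) + bot * fr                       # across-row interpolation
```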
2) Scanning range rasterization
The rasterized reconnaissance coverage is shown in fig. 7: the area formed by the four points A, B, C, D is rasterized, divided into m equal parts in the AD direction and n equal parts in the AB direction, giving m × n sub-quadrilateral areas; the center points of the sub-quadrilaterals are stored in preparation for use as line-of-sight sampling points.
The main parameters used for rasterization are:
① m: azimuth-direction discretization coefficient; m is an integer with minimum value 1; the smaller the value, the coarser the subdivision granularity and the larger the result deviation (but the higher the calculation efficiency); it is input and set by the planner;
② n: range-direction discretization coefficient, set in the same way; m and n are not necessarily equal;
③ m × n: number of discrete points covering one task load detection footprint;
④ d: azimuth-direction minimum subdivision distance, d = LAD / m, where LAD is the length of the line connecting the horizontal projection points of A and D.
(4) Grid small region visibility calculation
The line-of-sight calculation of each small grid area is performed using the terrain elevation data, the current UAV position data and the rasterized data, giving the visibility result for each small area. In the line-of-sight analysis, whether two points are visible to each other is calculated and judged from the current UAV position and the target point position in combination with the elevation terrain. The line-of-sight analysis of the viewpoint and target point combined with the terrain elevation is shown schematically in fig. 8: the figure shows a terrain profile obtained by cutting the terrain longitudinally with the UAV flight profile; the real-time UAV position is connected to the target point, and if the connecting line intersects the terrain profile, the target is not visible from the current point (the invisible area in the figure), otherwise it is visible.
There are many kinds of visibility algorithms; the present invention adopts the DYNTACS visibility algorithm, which is well known to those skilled in the art and is not described again. See fig. 9.
Whether the UAV task load can see the ground target and whether the ground target can see the UAV are judged by the same criterion. For convenience of calculation, the ground target is taken as the viewpoint (point V), the detection distance of the UAV task load, converted to a ground distance, as the action range, and the boundary position (one direction being chosen first) as the target point (point T). The intersection points are then analyzed point by point from the viewpoint towards the target point, the intersection points being F(xi, yi), G(xi+1, yi+1), .... The slope tan α of the sight line VT is calculated by:
tan α = (ZT − ZV) / DTV    (17)
where:
ZT is the height of point T(xT, yT);
ZV is the height of point V(xV, yV);
DTV = √((xT − xV)² + (yT − yV)²) is the horizontal distance from point T to point V.
A coordinate system is established with the eastward reference horizontal plane as the X axis and the upward vertical at point V as the Y axis. The angle β_i between the line joining viewpoint V to each intersection point and the horizontal plane has slope:

tan β_i = (Z_i - Z_V) / √((x_i - x_V)² + (y_i - y_V)²)   (18)
wherein:
Z_i is the height of point F(x_i, y_i);
Z_V is the height of point V(x_V, y_V);
√((x_i - x_V)² + (y_i - y_V)²) is the horizontal distance from point F to point V.
The visibility judgment is completed by comparing the values of tan α and tan β_i, where tan β_i is obtained from equation (18):
If tan β_i is greater than tan α, the points are not mutually visible and the calculation ends.
If tan β_i is not greater than tan α, point F is seen through, and the next point G(x_{i+1}, y_{i+1}) is calculated; if this can be continued all the way to the target point T, the viewpoint and the target point are mutually visible. Note that F is not necessarily the first intersection of the projection of the sight line VT on the (x, y) plane with a grid-cell edge, nor is G(x_{i+1}, y_{i+1}) necessarily the second one or the point immediately after F. A code sketch of this test follows.
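The following minimal Python sketch illustrates this test under simplifying assumptions: the terrain is queried through a hypothetical elevation(x, y) callback, and the intersection points F, G, … are approximated by uniform sampling along VT rather than by exact grid-edge crossings, so it follows equations (17) and (18) but is not the full DYNTACS implementation.

import math

def visible(V, T, elevation, steps=100):
    # V and T are (x, y, z) tuples; z is the height of the point.
    xv, yv, zv = V
    xt, yt, zt = T
    d_vt = math.hypot(xt - xv, yt - yv)        # horizontal distance V -> T
    if d_vt == 0.0:
        return True                            # degenerate case: same spot
    tan_alpha = (zt - zv) / d_vt               # equation (17)
    for k in range(1, steps):                  # sample points F, G, ... on VT
        t = k / steps
        xi = xv + (xt - xv) * t
        yi = yv + (yt - yv) * t
        d_fv = math.hypot(xi - xv, yi - yv)    # horizontal distance F -> V
        tan_beta = (elevation(xi, yi) - zv) / d_fv   # equation (18)
        if tan_beta > tan_alpha:               # terrain blocks the sight line
            return False                       # not mutually visible
    return True                                # reached T: mutually visible

Iterating this test over the m × n cell centre points produced by the rasterization step marks each cell as visible or invisible, as in fig. 10.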
According to equation (17), equation (18) and the above visibility-judgment procedure, completing the grid-cell visibility calculation once, based on the waypoint data and related inputs, constitutes one sampling calculation. After one sampling calculation is completed and its result obtained, the calculation iterates over the grid parameters in the azimuth and distance directions, and so on, yielding the visible and invisible cells within the coverage of one sampling. The calculated see-through area is shown in fig. 10: every point in the grid of fig. 10 undergoes a visibility calculation against the current position of the unmanned aerial vehicle; the cells containing invisible points are filled black, and visible points are left unmarked (the small dots in the figure).
(5) Merge calculations
The grid-cell visibility calculation is performed for each visibility calculation point based on the waypoint data and related inputs, and the visibility results are merged to obtain the see-through / non-see-through grid map shown in fig. 13. This step also synchronously computes the merged result of the scanning areas. Specifically, after one sampling calculation is completed (equations (17) and (18)), the see-through and non-see-through grid maps shown in fig. 10 are obtained; by analogy, the scanning areas of all waypoints are calculated from the waypoint data and the sampling interval, the see-through and non-see-through grid maps shown in fig. 12 are obtained, and the merge calculation is performed. Let U1 be the union of the scanning areas of all waypoints, U2 the union of all invisible areas, U3 the intersection of the scanning area with the task area, and U4 the intersection of the invisible areas with the task area. The concrete algorithm steps of the merge calculation are as follows (a code sketch follows the steps):
① As shown in fig. 11, waypoint 1 and waypoint 2 represent any two waypoints on the mission route of the unmanned aerial vehicle; the calculation for the other waypoints is analogous. Waypoints are interpolated between waypoint 1 and waypoint 2 at the sampling interval (suggested to be an integral multiple of the minimum azimuth-direction subdivision distance d), and the scanning-area data generated at all the interpolated waypoints is collected. This data forms several sectors, as shown in fig. 11: each sector is the scanning-area data generated at one interpolated waypoint between waypoint 1 and waypoint 2, so stepping through the interpolated waypoints is equivalent to interpolated sampling, one sector being calculated per interpolated point.
② The boundary points of the polygonal area are arranged and stored in counter-clockwise order.
③ The sector areas are merged to obtain the polygons formed by the scanning-area boundary points; all polygon areas formed by the scanning areas are traversed, and the currently merged region is united with each polygon in turn (the union operation and the traversal method are well known to those skilled in the art and are not described again).
④ Based on the traversal and the one-by-one union results, the boundary-point data of the united polygon region is calculated; the merged region is shown by the thick outermost outline in fig. 11 (the method is well known to those skilled in the art and is not described again).
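As one possible implementation of steps ① to ④, the sketch below uses the third-party shapely library for the polygon union. This is an assumed choice: the patent only requires a union routine over counter-clockwise boundary-point polygons, per the referenced polygon algorithms.

from shapely.geometry import Polygon
from shapely.ops import unary_union

def merge_scan_sectors(sector_boundaries):
    # sector_boundaries: one list of counter-clockwise (x, y) boundary
    # points per interpolated-waypoint sector (step 2 above).
    sectors = [Polygon(pts) for pts in sector_boundaries]
    merged = unary_union(sectors)   # one-by-one union of all sectors
    return merged

For a single connected result, the merged boundary points (the thick outermost outline of fig. 11) can be read back with list(merged.exterior.coords); a disconnected result is returned as a MultiPolygon with one exterior ring per part.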
(6) Calculating scout coverage
In this step, the intersection of the merged see-through area with the task area is calculated to obtain the reconnaissance coverage range, and the reconnaissance coverage rate is then calculated from it.
The calculation recurses along the route direction (the recursive sampling interval is set to an integral multiple of the minimum azimuth-direction subdivision distance d; in this calculation mode the multiple has a minimum of 1 and a maximum of m), and the reconnaissance coverage range of the whole route is obtained from the intersection U3 of the see-through area with the task area and the intersection U4 of the invisible area with the task area calculated in steps ① to ④.
In fig. 12:
P1 is the position of the unmanned aerial vehicle at the first calculation, which is also the position of the task load;
A1, B1, C1, D1 are the boundary points of the scanning range of the task load at the current position of the unmanned aerial vehicle when it is at P1. The A1D1 arc is the near boundary, with A1 on the left and D1 on the right; the B1C1 arc is the far boundary, with B1 on the left and C1 on the right. M1 and N1 are the centre points of the A1D1 arc and the B1C1 arc respectively.
Similarly, P2 is the position of the unmanned aerial vehicle at the next calculation, which is again the position of the task load; the descriptions and definitions of the other points are consistent with the corresponding points for P1.
In fig. 13:
U1 is the union of the scanning areas of all waypoints;
U2 is the union of all invisible areas;
U3 is the intersection of the scanning area with the task area;
U4 is the intersection of the invisible area with the task area. The lowermost outlined loop in fig. 13 is U4.
This part mainly performs two-dimensional graphic calculation and processing, and the mathematics involved is straightforward, so it is not described in detail. The two-dimensional graphic calculation mainly follows published polygon algorithms ((1) Wei Xuqing, 'Algorithm for calculating the intersection and union area of polygons', School of Mathematics and Computer Science, Hunan University; (2) He, 'Computer Graphics') and proceeds as follows (a code sketch follows the list):
① compute the set of all task areas;
② merge the see-through scanning areas of all waypoints;
③ take a single task area out of the task-area set;
④ arrange the boundary-point data of the task area in reverse order;
⑤ compute the intersection of the task-area data with the see-through area to obtain the intersection boundary points;
⑥ compute the intersection area from the boundary points of the intersection region;
⑦ divide the intersection area by the area of the task area to obtain the scanning coverage rate of that task area;
⑧ repeat steps ③ to ⑦ to calculate the coverage rate of all the task areas;
⑨ return the coverage rate of each task area and the boundary points of its intersection region.
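A minimal Python sketch of steps ① to ⑨ follows, again assuming the shapely library for the intersection and area computations; the subtraction of the invisible region U2 before taking the ratio is omitted for brevity, and all names are illustrative.

from shapely.geometry import Polygon

def scout_coverage(visible_union, task_area_boundaries):
    # visible_union: merged see-through polygon from the merge step;
    # task_area_boundaries: one boundary-point list per task area.
    results = []
    for pts in task_area_boundaries:               # step 3: one area at a time
        task = Polygon(pts)
        inter = task.intersection(visible_union)   # step 5: boundary of U3
        rate = inter.area / task.area              # steps 6-7: area ratio
        results.append((rate, inter))              # step 9: rate and region
    return results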
Aimed at the broad application environments of task-load detection, the invention mainly solves the following problems:
(1) how to realize the calculation of single-point target reconnaissance monitoring under various terrains;
(2) how to realize coverage scouting of an area under various terrains;
(3) how to plan a reasonable route based on the regional reconnaissance requirement.
To this end, the invention achieves the following:
(1) under the condition that the original unmanned aerial vehicle planning service flow is not changed, the visibility analysis calculation is carried out by introducing elevation data, so that the reconnaissance coverage rate can be calculated more accurately;
(2) under the condition that the use mode of the original unmanned aerial vehicle is not changed, the more reasonable air route planning of the unmanned aerial vehicle is realized through the coverage rate result which is accurately calculated;
(3) the method based on the technology can be expanded and applied to similar fields (such as unmanned aerial vehicle link planning), so that the method has better applicability.
Based on the above application requirements, calculating the reconnaissance coverage rate of the task area in combination with terrain elevation data can greatly improve the reasonableness and usability of the mission-planning result.

Claims (5)

1. An unmanned aerial vehicle task load reconnaissance coverage rate calculation method based on terrain visibility, characterized by comprising the following steps:
line-of-sight analysis is performed by introducing terrain elevation data to accurately calculate the task-load reconnaissance coverage area and coverage rate under the flight-path condition in the task area; the technique is applicable to the following conditions:
11) when the unmanned aerial vehicle executes the reconnaissance mission, the unmanned aerial vehicle is in a horizontal flight state, namely the unmanned aerial vehicle is in a state that the pitching and rolling are both 0 degree;
12) the landform of the task area may be either a mountain landform or a plain landform, provided that elevation data of the area exists;
13) the conditions of ground buildings, vegetation and other ground surface coverings are not considered;
14) the condition that optical task loads such as the photoelectric pod EO and the like are shielded by cloud layers, rain and snow under different weather conditions is not considered;
15) the situation that the complex electromagnetic environment generates interference on task loads such as radars and the like is not considered;
(1) data preparation
this stage obtains the task area data, the task-load parameters and state data, and the waypoint data formed based on the top-level task objective; the data are two-dimensionally calibrated to prepare the data for the next calculation, specifically including obtaining the unmanned aerial vehicle waypoints, the task areas, the task-load state data and the task-load performance parameters; in the mission planning phase, the known data are as follows:
s) the longitude, latitude and elevation (B1, L1, H) of the unmanned aerial vehicle position P in the WGS84 coordinate system;
t) the pitch angle θ of the centre of the task-load visual axis of the unmanned aerial vehicle, regarded as a parameter set by the operator, with a value range of 0-90°, the task load being horizontal at 0° and vertical at 90°;
u) the unmanned aerial vehicle heading angle ψ, obtained in real time from the calculation;
v) the unmanned aerial vehicle drift angle γ, obtained in real time from the calculation;
w) the horizontal size V_h and vertical size V_v of the field angle of the unmanned aerial vehicle task load; the two values are parameters of the task load and are conveniently calculated from the working mode selected by the operator and the set parameters; the near-boundary pitch angle of the unmanned aerial vehicle task load in the current state is

θ1 = θ + V_v/2
x) the far-boundary pitch angle of the unmanned aerial vehicle task load in the current state is

θ2 = θ - V_v/2
y) the limit pitch angle θmax of the unmanned aerial vehicle task load; this value is an inherent attribute of the task load, confirmed by consulting the task-load manual, with a value range of 0-90°;
z) the maximum acting distance Lmax of the unmanned aerial vehicle task load; this value is an inherent attribute of the task load, confirmed by consulting the task-load manual;
aa) the earth is approximated as a sphere with radius R = 6371000 metres;
(2) scan boundary calculation
in this stage, the scanning boundary points are calculated based on the waypoint data, the task-load parameters and the state data; the calculation process considers the influence of the earth's curvature but not the influence of the terrain elevation; the reconnaissance coverage rate calculation is applied in the mission pre-planning stage, the task load ideally being used in the level-flight stage of the unmanned aerial vehicle, in which the pitch and roll in the attitude of the unmanned aerial vehicle are both 0° and the task-load preset mode is a left side view or a right side view; taking the action range of the task load on the ground as a sector is the basic assumption, giving the reconnaissance coverage range of one sampling; let
The point P is the current position of the unmanned aerial vehicle;
P0 is its projection on the ground;
the point A is the intersection point of the projection of the left boundary of the task load view field and the near boundary of the task load detection range on the ground;
the point B is the intersection point of the left boundary of the task load view field and the far boundary of the task load detection range projected on the ground;
the point C is an intersection point of the right boundary of the task load view field and the far boundary of the task load detection range projected on the ground;
the point D is the intersection point of the projection of the right boundary of the task load view field and the near boundary of the task load detection range on the ground;
the AD arc line is a near boundary of a scanning range of the current position of the unmanned aerial vehicle task load;
the BC arc line is a far boundary of a scanning range of the current position of the unmanned aerial vehicle task load;
the M point is an intersection point of the projection of the central line of the visual axis of the task load and the near boundary of the detection range of the task load on the ground;
the N point is an intersection point of the central line of the visual axis of the task load and the projection of the remote boundary of the detection range of the task load on the ground;
the MN connecting line is a task load view field central line and is perpendicular to the flight direction of the unmanned aerial vehicle, namely, the included angle of 90 degrees with the flight direction is formed;
the longitude and latitude of the points A, B, C, D, M, N are obtained from the set parameters such as the flight height of the unmanned aerial vehicle, the maximum acting slant distance of the task load and the limit pitch angle of the task load; the specific calculation method is as follows:
① Calculate the longitude and latitude of point M
a) Central angle calculation
To calculate more accurately, the earth needs to be approximated as a sphere with the earth's curvature considered; known are the unmanned aerial vehicle position P, the earth centre O, the projection point P0 of point P onto the spherical surface, the near-boundary pitch angle of the task load θ1 = ∠OPM, the earth radius R, the flight height H, the maximum acting distance Lmax of the task load, the limit pitch angle θmax of the task load, the intersection point M′ of the ray emitted by the unmanned aerial vehicle at the limit pitch angle with the spherical surface, and the tangent point N′ of the ray emitted by the unmanned aerial vehicle tangent to the spherical surface; according to the cosine theorem, the first central angle θ3 = ∠MOP and the second central angle θ4 = ∠NOP are calculated first.
The specific steps are as follows:
i. Calculate the maximum acting slant distance within the line-of-sight range of the task load:

PN′ = √((R + H)² - R²)   (1)

ii. Calculate the minimum pitch angle ∠OPN′; since triangle OPN′ is right-angled at the tangent point N′:

∠OPN′ = arcsin(R / (R + H))   (2)

iii. Calculate the shortest slant distance PM′ reachable by the task load; substituting the parameters into the cosine theorem gives formula (3):

PM′² + (R + H)² - 2·PM′·(R + H)·cos(90° - θmax) = R²   (3)

from which PM′ is obtained;
iv. Check the near-boundary pitch angle θ1 of the task load.
To ensure the validity of the calculation, i.e. that the calculated near-boundary pitch angle θ1 lies within a reasonable value range, a check is required: if θ1 is less than the minimum pitch angle determined by ∠OPN′, θ1 is set to that minimum; if θ1 is greater than the physical limit pitch angle θmax of the task load (input through the top layer or preset), then θ1 = θmax, i.e. the value corresponding to ∠OPM′; otherwise the value is unchanged. In this way the task load is controlled to be within its effective, usable range;
v. Check the maximum acting distance Lmax.
If the maximum acting distance Lmax of the task load is greater than the maximum slant distance PN′, then Lmax = PN′; if Lmax is less than the shortest slant distance PM′, then Lmax = PM′; otherwise the value is unchanged. This controls the effective maximum acting distance of the task load within the usable range;
vi. Calculate the first central angle θ3, where PM is the slant distance corresponding to the checked near-boundary pitch angle θ1.
The PM value is calculated according to the cosine theorem:

PM² + (R + H)² - 2·PM·(R + H)·cos(90° - θ1) = R²   (4)

Using the cosine theorem again:

R² + (R + H)² - 2·R·(R + H)·cos θ3 = PM²   (5)

θ3 is then calculated:

θ3 = arccos((R² + (R + H)² - PM²) / (2·R·(R + H)))   (6)
Similarly, since Lmax is a known value, the second central angle θ4 corresponding to the points N and P is calculated directly by the cosine law:

R² + (R + H)² - 2·R·(R + H)·cos θ4 = Lmax²

θ4 = arccos((R² + (R + H)² - Lmax²) / (2·R·(R + H)))
θ3 and θ4 are then compared: if θ3 is greater than θ4, the values of θ3 and θ4 are exchanged; otherwise no change is made;
b) M-point latitude and longitude calculation
After the values of θ3 and θ4 have been obtained, the longitude and latitude of point M are calculated further. At this time the longitude, latitude, heading angle and drift angle (L1, B1, ψ, γ) of the orthographic projection point P0 of the unmanned aerial vehicle position onto the sphere are known, as is the central angle θ3 from P0 to point M. Q is the north pole, and P0, M, Q form a spherical triangle; B1, L1 and B2, L2 are the latitude and longitude coordinates of P0 and M respectively. The longitude and latitude coordinates of M are calculated with the spherical-triangle formulas.
For convenience of calculation, the spherical triangle is simplified as follows:
the arcs of the spherical triangle corresponding to Q, P0, M are a, b, c respectively, and A′, B′, C′ are the angles on the sphere: A′ is short for ∠P0QM, B′ is short for ∠QP0M, and C′ is short for ∠QMP0; a is the arc P0M on the great circle through the sphere centre, b is the arc QM, and c is the arc P0Q.
In the spherical triangle P0MQ, the latitude B1 and longitude L1 of point P0, the heading angle ψ, the drift angle γ and the central angle ∠P0OM are known, and the latitude B2 and longitude L2 of the target point M are calculated.
i. Calculate the longitude L2 of point M.
It is known that A′ = L2 - L1, c = ∠P0OQ = 90° - B1, b = ∠MOQ = 90° - B2, and a = ∠P0OM; a, b and c are the arc lengths of the spherical triangle on a sphere of unit radius, so each arc length equals its central angle and the arcs are expressed as angles.
Since PM is perpendicular to the route direction, ∠B′ = 90° + ψ + γ.
According to the spherical sine formula:

sin a / sin A′ = sin b / sin B′ = sin c / sin C′   (7)

Substituting the known conditions gives:

sin a·sin B′ = sin(90° - B2)·sin A′   (8)

According to the spherical-triangle cotangent formula cot a·sin c = cot A′·sin B′ + cos B′·cos c, substituting the known conditions gives:

cot a·sin(90° - B1) = cot A′·sin B′ + cos B′·cos(90° - B1)   (9)

After simplification:

L2 = L1 + arctan(sin B′ / (cot a·cos B1 - cos B′·sin B1))   (10)

from which the longitude L2 is calculated.
ii. Calculate the latitude B2 of point M.
Before calculating the latitude B2 of M, B′ is checked first:
if the value of B′ is zero, the target point M and the point P0 lie on the same meridian, and B2 = B1 + ∠P0OM;
if the value of B′ is not zero, B2 must be determined by judging its quadrant; the calculation formula is as follows:
cos B2 = cos a·sin B1 + sin a·cos B1   (11)

Converted by equation (7):

sin B2 = sin a·sin B′ / sin A′   (12)

The value of L2 - L1 is then judged: if L2 - L1 is greater than zero, the value of sin B2 is left unchanged; otherwise the value of sin B2 is negated.
The value of cos B2 is judged next: if cos B2 is greater than or equal to zero, b lies in the first or fourth quadrant; otherwise b lies in the second or third quadrant. On this basis, the quadrant of b is determined by also judging the value of sin B2:
when cos B2 > 0 and sin B2 ≥ 0, b is in the first quadrant:
B2 = 90° - arcsin(sin B2)   (13)
when cos B2 ≥ 0 and sin B2 < 0, b is in the fourth quadrant:
B2 = 90° + arcsin(sin B2)   (14)
when cos B2 < 0 and sin B2 ≥ 0, b is in the second quadrant:
B2 = 90° + arcsin(sin B2)   (15)
when cos B2 < 0 and sin B2 < 0, b is in the third quadrant:
B2 = 90° - arcsin(sin B2)   (16)
Therefore, the longitude and latitude of the M point are calculated; the longitude and latitude of N, A, B, C, D can be calculated by repeating the calculation steps;
(3) processing a scanning area;
the scanning area is processed by adopting a rasterization method;
1) terrain elevation acquisition
After the longitude and latitude of each point under the influence of the earth's curvature are obtained, the altitude of each of the points A, B, C, D, M, N is determined by looking up the elevation data with the longitude and latitude;
2) scanning range rasterization
Rasterize the area formed by the four points A, B, C, D: divide it into m equal parts in the AD direction and n equal parts in the AB direction to obtain m × n sub-quadrilateral cells, and store the centre points of the cells in preparation for use as line-of-sight sampling points;
the parameters used for rasterization are as follows:
① m: the azimuth-direction discretization (equal-division) coefficient; m is an integer with a minimum value of 1; the smaller the value, the coarser the subdivision granularity and the larger the result deviation, but the higher the calculation efficiency; it is input and set by the planner;
② n: the distance-direction discretization coefficient, set in the same way; m and n are not necessarily equal;
③ m × n: the number of discrete points covering one task-load detection footprint;
④ d: the minimum azimuth-direction subdivision distance, d = L_AD/m, where L_AD is the length of the line connecting the horizontal projection points of A and D;
(4) grid cell visibility calculation
the line-of-sight calculation of each grid cell is performed using the terrain elevation data, the current unmanned aerial vehicle position data and the rasterized data, obtaining the visibility result of each cell; during the line-of-sight analysis, whether two points are mutually visible is calculated and judged from the current unmanned aerial vehicle position and the target-point position in combination with the elevation terrain; the unmanned aerial vehicle flight profile cuts the terrain longitudinally to give a terrain profile, and the real-time position of the unmanned aerial vehicle is connected to the target point: if the line intersects the terrain profile, the target is not visible from the current point, otherwise it is visible;
whether the unmanned aerial vehicle task load has line of sight to the ground target is judged on the same basis as whether the ground target has line of sight to the unmanned aerial vehicle; for convenience of calculation, the ground target is taken as the viewpoint (point V), the detection distance of the unmanned aerial vehicle task load, converted into a ground distance, as the action range, and a boundary position of the task load, with a direction selected first, as the target point (point T); the intersection points are then analysed point by point from the viewpoint to the target point, where F(x_i, y_i), G(x_{i+1}, y_{i+1}), … are the intersections of the projection of the sight line VT on the (x, y) plane with the edges of the square grid cells; the slope α of the sight line VT is calculated by the following formula:
tan α = (Z_T - Z_V) / √((x_T - x_V)² + (y_T - y_V)²)   (17)
wherein:
Z_T is the height of point T(x_T, y_T);
Z_V is the height of point V(x_V, y_V);
√((x_T - x_V)² + (y_T - y_V)²) is the horizontal distance from point T to point V;
a coordinate system is established with the eastward reference horizontal plane as the X axis and the upward vertical at point V as the Y axis; the angle β_i between the line joining viewpoint V to each intersection point and the horizontal plane has slope:

tan β_i = (Z_i - Z_V) / √((x_i - x_V)² + (y_i - y_V)²)   (18)
wherein:
Z_i is the height of point F(x_i, y_i);
Z_V is the height of point V(x_V, y_V);
√((x_i - x_V)² + (y_i - y_V)²) is the horizontal distance from point F to point V;
the visibility judgment is completed by comparing the values of tan α and tan β_i:
if tan β_i is greater than tan α, the points are not mutually visible and the calculation ends;
if tan β_i is not greater than tan α, point F is seen through, and the next point G(x_{i+1}, y_{i+1}) is calculated; if this can be continued all the way to the target point T, the viewpoint and the target point are mutually visible;
according to formula (17), formula (18) and the above visibility-judgment procedure, completing the grid-cell visibility calculation once based on the waypoint data and related inputs constitutes one sampling calculation; after one sampling calculation is completed and its result obtained, the calculation iterates over the grid parameters in the azimuth and distance directions, and so on, obtaining the visible and invisible cells within the coverage of one sampling;
(5) merge calculations
the grid-cell visibility calculation is performed for each visibility calculation point based on the waypoint data and related inputs, and the visibility results are merged to obtain a see-through / non-see-through grid map; this step also synchronously computes the merged result of the scanning areas; specifically, after one sampling calculation is completed, a see-through / non-see-through grid map is obtained; by analogy, the scanning areas of all waypoints are calculated from the waypoint data and the segmentation interval, the see-through and non-see-through grid maps are obtained, and the merge calculation is performed; let U1 be the union of the scanning areas of all waypoints, U2 the union of all invisible areas, U3 the intersection of the scanning area with the task area, and U4 the intersection of the invisible areas with the task area; the specific algorithm steps of the merge calculation are as follows:
① let waypoint 1 and waypoint 2 represent any two waypoints on the mission route of the unmanned aerial vehicle, the calculation for the other waypoints being analogous; interpolate waypoints between waypoint 1 and waypoint 2 at the sampling interval and collect the scanning-area data generated at all of them, the scanning area being several sectors, each sector being the scanning-area data generated at one waypoint between waypoint 1 and waypoint 2;
② arrange and store the boundary points of the polygonal area in counter-clockwise order;
③ merge the sector areas to obtain the polygons formed by the scanning-area boundary points; traverse all polygon areas formed by the scanning areas, and unite the currently merged region with each polygon in the polygon areas one by one;
④ based on the traversal and the one-by-one union results, calculate the boundary-point data of the united polygon region;
(6) calculating scout coverage
recursion is performed along the route direction, and the reconnaissance coverage range of the whole route is obtained from the intersection U3 of the see-through area with the task area and the intersection U4 of the invisible area with the task area calculated in steps ① to ④;
let:
P1 is the position of the unmanned aerial vehicle at the first calculation, which is also the position of the task load;
A1, B1, C1, D1 are the boundary points of the scanning range of the task load at the current position of the unmanned aerial vehicle when it is at P1; the A1D1 arc is the near boundary, with A1 on the left and D1 on the right; the B1C1 arc is the far boundary, with B1 on the left and C1 on the right; M1 and N1 are the centre points of the A1D1 arc and the B1C1 arc respectively;
similarly, P2 is the position of the unmanned aerial vehicle at the next calculation, which is again the position of the task load; the descriptions and definitions of the other points are consistent with the corresponding points for P1;
let:
U1 is the union of the scanning areas of all waypoints;
U2 is the union of all invisible areas;
U3 is the intersection of the scanning area with the task area;
U4 is the intersection of the invisible area with the task area;
and the scout coverage rate is obtained by two-dimensional graphic calculation and processing.
2. The terrain visibility-based unmanned aerial vehicle task load reconnaissance coverage calculation method of claim 1, wherein a DYNTACS visibility algorithm is adopted in the step (4).
3. The terrain visibility-based unmanned aerial vehicle mission load reconnaissance coverage calculation method of claim 1, wherein in step (5) the sampling interval is an integral multiple of the minimum azimuth-direction subdivision distance d.
4. The terrain visibility-based unmanned aerial vehicle mission load reconnaissance coverage calculation method of claim 1, wherein in step (6) the recursive sampling interval is set to an integral multiple of the minimum azimuth-direction subdivision distance d, the multiple ranging from a minimum of 1 to a maximum of m.
5. The terrain visibility-based unmanned aerial vehicle mission load reconnaissance coverage calculation method of claim 1, wherein in step (6) the two-dimensional graphic calculation and processing comprises the following specific steps:
① compute the set of all task areas;
② merge the see-through scanning areas of all waypoints;
③ take a single task area out of the task-area set;
④ arrange the boundary-point data of the task area in reverse order;
⑤ compute the intersection of the task-area data with the see-through area to obtain the intersection boundary points;
⑥ compute the intersection area from the boundary points of the intersection region;
⑦ divide the intersection area by the area of the task area to obtain the scanning coverage rate of that task area;
⑧ repeat steps ③ to ⑦ to calculate the coverage rate of all the task areas.
CN202210064053.4A 2022-01-16 2022-01-16 Unmanned aerial vehicle task load reconnaissance coverage rate calculation method based on topography Active CN114490815B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210064053.4A CN114490815B (en) 2022-01-16 2022-01-16 Unmanned aerial vehicle task load reconnaissance coverage rate calculation method based on topography


Publications (2)

Publication Number Publication Date
CN114490815A true CN114490815A (en) 2022-05-13
CN114490815B CN114490815B (en) 2024-04-02

Family

ID=81472107

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210064053.4A Active CN114490815B (en) 2022-01-16 2022-01-16 Unmanned aerial vehicle task load reconnaissance coverage rate calculation method based on topography

Country Status (1)

Country Link
CN (1) CN114490815B (en)


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2557971B1 (en) * 1984-01-06 1988-05-27 Thomson Csf PILOTLESS AIRCRAFT MONITORING SYSTEM FOR OBJECTIVE LOCATION
RU2364895C1 (en) * 2007-12-17 2009-08-20 Горный институт Уральского отделения Российской академии наук (ГИ УрО РАН) Method for multicomponent gravimetric modeling of geological medium
CN103136393B (en) * 2011-11-28 2015-10-07 中国电子科技集团公司第五十四研究所 A kind of areal coverage computing method based on stress and strain model
RU2521755C1 (en) * 2013-01-10 2014-07-10 Федеральное государственное бюджетное учреждение науки Институт биологии Коми научного центра Уральского отделения Российской академии наук Technology of resource assessment of rangelands of reindeer on multispectral satellite data
CN109885102B (en) * 2019-03-18 2021-12-10 西安爱生技术集团公司 Automatic task route planning method suitable for photoelectric load unmanned aerial vehicle system
CN112418188A (en) * 2020-12-17 2021-02-26 成都亚讯星科科技股份有限公司 Crop growth whole-course digital assessment method based on unmanned aerial vehicle vision
CN113724392A (en) * 2021-07-22 2021-11-30 中国电子科技集团公司第二十八研究所 Unmanned aerial vehicle investigation load three-dimensional simulation scanning area calculation method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115331131A (en) * 2022-10-17 2022-11-11 四川腾盾科技有限公司 Unmanned aerial vehicle mission planning auxiliary decision-making method
CN115331131B (en) * 2022-10-17 2023-02-17 四川腾盾科技有限公司 Unmanned aerial vehicle mission planning auxiliary decision-making method
CN116561835A (en) * 2023-07-05 2023-08-08 成都纵横自动化技术股份有限公司 Task planning method and system based on construction of ground projection geometric model
CN116561835B (en) * 2023-07-05 2023-10-13 成都纵横自动化技术股份有限公司 Task planning method and system based on construction of ground projection geometric model
CN116701827A (en) * 2023-08-03 2023-09-05 中科星图测控技术股份有限公司 Quick topographic occlusion calculating method for space-sky target visibility analysis
CN116701827B (en) * 2023-08-03 2023-10-27 中科星图测控技术股份有限公司 Quick topographic occlusion calculating method for space-sky target visibility analysis
CN117156111A (en) * 2023-10-27 2023-12-01 长春长光睿视光电技术有限责任公司 Coverage planning method of wide-area photoelectric imaging system based on static platform

Also Published As

Publication number Publication date
CN114490815B (en) 2024-04-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant