CN115793715B - Unmanned aerial vehicle auxiliary flight method, system, device and storage medium


Info

Publication number: CN115793715B
Application number: CN202310010070.4A
Authority: CN (China)
Prior art keywords: flight; unmanned aerial vehicle; information; collision
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN115793715A
Inventors: 王利, 马继生, 许洪波, 刘明, 马晓彪, 陈航, 杨永坡
Current assignee: Xiongan Xiongchuang Digital Technology Co ltd
Original assignee: Xiongan Xiongchuang Digital Technology Co ltd
Application filed by Xiongan Xiongchuang Digital Technology Co ltd
Priority to CN202310010070.4A
Publication of CN115793715A
Application granted; publication of CN115793715B

Classifications

    • Y — General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02 — Technologies or applications for mitigation or adaptation against climate change
    • Y02T — Climate change mitigation technologies related to transportation
    • Y02T10/00 — Road transport of goods or passengers
    • Y02T10/10 — Internal combustion engine [ICE] based vehicles
    • Y02T10/40 — Engine management systems

Abstract

The application provides an unmanned aerial vehicle auxiliary flight method, system, device and storage medium, wherein the method comprises: receiving a navigation request from an unmanned aerial vehicle navigation application terminal; acquiring flight information of the unmanned aerial vehicle in response to the navigation request; acquiring unmanned aerial vehicle flight-restriction information and a target flight scene model according to the flight information; and generating an unmanned aerial vehicle navigation path based on the target flight scene model according to the flight-restriction information and the flight information, so as to assist the unmanned aerial vehicle in flying. By determining the flight-restriction information of the region flown over, the method avoids the unmanned aerial vehicle being intercepted mid-flight because auxiliary flight was provided blindly while the restrictions were unknown, which improves navigation accuracy; and by determining the target flight scene model of the region flown over from the flight information, the unmanned aerial vehicle is assisted through a three-dimensional scene, the situation in which it cannot take off for special reasons such as adverse weather is avoided, and the flexibility and agility of the unmanned aerial vehicle are improved.

Description

Unmanned aerial vehicle auxiliary flight method, system, device and storage medium
Technical Field
The application relates to unmanned aerial vehicle navigation technology, and in particular to an unmanned aerial vehicle auxiliary flight method, system, device and storage medium.
Background
With the development of the times and advances in technology, unmanned aerial vehicles play an increasingly important role in urban operations; inspection and transportation by unmanned aerial vehicle are becoming routine, so providing accurate in-city navigation for unmanned aerial vehicles is imperative. Traditional unmanned aerial vehicle flight requires the operator to share the same space as the vehicle and to judge its position and heading visually; conditions unsuitable for going outdoors, such as rain, snow, smog or dust, therefore greatly limit the vehicle's flight capability. Although methods exist for assisting unmanned aerial vehicle flight with a two-dimensional map, such a map contains only two-dimensional terrain and road data; it lacks building positions and heights, city-level oblique photography data, and regional information such as no-fly zones and height-limited zones, and therefore cannot meet the requirement of accurate unmanned aerial vehicle navigation.
Disclosure of Invention
The embodiments of the present application provide an unmanned aerial vehicle auxiliary flight method, system, device and storage medium.
According to a first aspect of the present application, there is provided an unmanned aerial vehicle auxiliary flight method, the method comprising: receiving a navigation request from an unmanned aerial vehicle navigation application terminal; acquiring flight information of the unmanned aerial vehicle in response to the navigation request, wherein the flight information comprises flight timestamp information, flight start point information and flight end point information; acquiring unmanned aerial vehicle flight-restriction information and a target flight scene model according to the flight information; and generating an unmanned aerial vehicle navigation path based on the target flight scene model according to the flight-restriction information and the flight information, so as to assist the unmanned aerial vehicle in flying.
According to an embodiment of the present application, the navigation request carries unmanned aerial vehicle device information and user information. Correspondingly, before the flight information of the unmanned aerial vehicle is acquired in response to the navigation request, the method further comprises: judging, according to the device information and the user information, whether the unmanned aerial vehicle requesting flight navigation complies with the flight-authority authentication rules; and acquiring the flight information of the unmanned aerial vehicle in response to the navigation request only if the unmanned aerial vehicle complies with the rules.
According to an embodiment of the application, the unmanned aerial vehicle flight-restriction information comprises no-fly information and height-limit information. Correspondingly, acquiring the flight-restriction information and the target flight scene model according to the flight information comprises: determining, according to the flight timestamp information, the flight start point information and the flight end point information, the no-fly information and height-limit information of the region the flight passes through, as well as the target flight scene model corresponding to that region.
According to an embodiment of the present application, generating the unmanned aerial vehicle navigation path based on the target flight scene model according to the flight-restriction information and the flight information comprises: generating a flight vector based on the flight start point information and the flight end point information; taking the height-limited regions indicated by the height-limit information, the no-fly regions indicated by the no-fly information, and the solid models in the target flight scene model as collision bodies, and performing collision detection between the flight vector and the collision bodies to obtain target collision bodies; and generating the unmanned aerial vehicle navigation path based on the target flight scene model according to the target collision bodies and the flight information.
According to an embodiment of the present application, performing collision detection on the flight vector according to the collision bodies to obtain the target collision bodies comprises: computing a bounding sphere for each collision body; performing a first collision detection pass between the flight vector and the bounding spheres, and discarding the collision bodies that do not collide with the flight vector, to obtain first collision bodies; computing an axis-aligned bounding box for each first collision body; performing a second collision detection pass between the flight vector and the axis-aligned bounding boxes, and discarding the first collision bodies that do not collide with the flight vector, to obtain second collision bodies; computing an oriented bounding box for each second collision body; and determining, according to the oriented bounding boxes, which of the second collision bodies collide with the flight vector, to obtain the target collision bodies.
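The three detection passes above (bounding sphere, then axis-aligned bounding box, then oriented bounding box, each pass culling bodies the flight vector cannot hit) can be sketched in Python as follows. This is a minimal illustrative sketch rather than the patent's implementation; the dictionary keys describing each collision body and the helper names are assumptions.

```python
import numpy as np

def segment_hits_sphere(p0, p1, center, radius):
    """First pass: is the distance from the sphere center to segment p0-p1
    no greater than the radius?"""
    d = p1 - p0
    t = np.clip(np.dot(center - p0, d) / np.dot(d, d), 0.0, 1.0)
    return np.linalg.norm(center - (p0 + t * d)) <= radius

def segment_hits_aabb(p0, p1, box_min, box_max):
    """Second pass: slab test clipping the segment against the three
    axis-aligned slabs of the box."""
    d = p1 - p0
    tmin, tmax = 0.0, 1.0
    for axis in range(3):
        if abs(d[axis]) < 1e-12:
            # Segment parallel to this slab: it must start inside it.
            if not (box_min[axis] <= p0[axis] <= box_max[axis]):
                return False
        else:
            t1 = (box_min[axis] - p0[axis]) / d[axis]
            t2 = (box_max[axis] - p0[axis]) / d[axis]
            t1, t2 = min(t1, t2), max(t1, t2)
            tmin, tmax = max(tmin, t1), min(tmax, t2)
            if tmin > tmax:
                return False
    return True

def filter_collision_bodies(p0, p1, bodies):
    """Three-pass culling: bounding sphere, then AABB, then OBB."""
    first = [b for b in bodies
             if segment_hits_sphere(p0, p1, b["sphere_center"], b["sphere_radius"])]
    second = [b for b in first
              if segment_hits_aabb(p0, p1, b["aabb_min"], b["aabb_max"])]
    targets = []
    for b in second:
        # Third pass: express the segment in the box's local frame, where the
        # oriented box becomes axis-aligned, and reuse the slab test.
        R, c, half = b["obb_rotation"], b["obb_center"], b["obb_half_extents"]
        if segment_hits_aabb(R.T @ (p0 - c), R.T @ (p1 - c), -half, half):
            targets.append(b)
    return targets
```

The cheap sphere test rejects most bodies early; only the survivors pay for the tighter box tests.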
According to an embodiment of the present application, the method further comprises: determining, according to the start coordinates and end coordinates of the unmanned aerial vehicle navigation path, whether the path passes through a no-fly zone; acquiring no-fly permit information if it does; and providing the unmanned aerial vehicle navigation path to the unmanned aerial vehicle navigation application terminal only if the permit information complies with the no-fly regulations.
According to a second aspect of the present application, there is provided an unmanned aerial vehicle auxiliary flight system, the system comprising: a navigation platform for receiving a navigation request from an unmanned aerial vehicle navigation application terminal and, in response to the navigation request, acquiring flight information of the unmanned aerial vehicle, the flight information comprising flight timestamp information, flight start point information and flight end point information; and a CIM platform for acquiring the unmanned aerial vehicle flight-restriction information and the target flight scene model according to the flight information, and generating an unmanned aerial vehicle navigation path based on the target flight scene model according to the flight-restriction information and the flight information, so as to assist the unmanned aerial vehicle in flying; wherein the CIM platform and the navigation platform communicate bidirectionally.
According to a third aspect of the present application, there is provided an unmanned aerial vehicle auxiliary flight device, the device comprising: a receiving module for receiving a navigation request from an unmanned aerial vehicle navigation application terminal; a first acquisition module for acquiring flight information of the unmanned aerial vehicle in response to the navigation request, the flight information comprising flight timestamp information, flight start point information and flight end point information; a second acquisition module for acquiring unmanned aerial vehicle flight-restriction information and a target flight scene model according to the flight information; and a path determination module for generating an unmanned aerial vehicle navigation path based on the target flight scene model according to the flight-restriction information and the flight information, so as to assist the unmanned aerial vehicle in flying.
According to an embodiment of the present application, the path determination module comprises: a generation sub-module for generating a flight vector based on the flight start point information and the flight end point information; a collision detection sub-module for taking the height-limited regions, the no-fly regions and the solid models in the target flight scene model as collision bodies and performing collision detection between the flight vector and the collision bodies to obtain target collision bodies; and a path determination sub-module for generating the unmanned aerial vehicle navigation path based on the target flight scene model according to the target collision bodies and the flight information.
According to a fourth aspect of the present application, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the above-described unmanned aerial vehicle assisted flight method.
According to the method of the present application, a navigation request from an unmanned aerial vehicle navigation application terminal is received; flight information of the unmanned aerial vehicle, comprising flight timestamp information, flight start point information and flight end point information, is acquired in response to the navigation request; unmanned aerial vehicle flight-restriction information and a target flight scene model are acquired according to the flight information; and an unmanned aerial vehicle navigation path is generated based on the target flight scene model according to the flight-restriction information and the flight information, so as to assist the unmanned aerial vehicle in flying. In this way, the flight-restriction information of the region flown over can be determined from the flight information, which avoids the unmanned aerial vehicle being intercepted mid-flight because auxiliary flight was provided blindly while the restrictions were unknown, and thereby improves navigation accuracy; and the target flight scene model of the region flown over is determined from the flight information, so that the unmanned aerial vehicle is assisted through a three-dimensional scene, the situation in which it cannot take off for special reasons such as adverse weather is avoided, and the flexibility and agility of the unmanned aerial vehicle are improved.
It should be understood that practicing the teachings of the present application does not require achieving all of the above benefits; rather, particular technical solutions may achieve particular technical effects, and other embodiments of the present application may achieve benefits not mentioned above.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the present application are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which:
in the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Fig. 1 shows a schematic implementation flow chart of an auxiliary flight method of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 2 is a schematic implementation flow diagram of a navigation request authentication method of the unmanned aerial vehicle auxiliary flight method according to an embodiment of the present application;
fig. 3 is a schematic implementation flow diagram of a navigation path determining method of the unmanned aerial vehicle auxiliary flight method according to an embodiment of the present application;
fig. 4 is a schematic implementation flow diagram of a target collision body determining method of the unmanned aerial vehicle auxiliary flight method according to the embodiment of the application;
fig. 5 shows a schematic diagram of a composition structure of an auxiliary flying device of the unmanned aerial vehicle according to an embodiment of the present application;
Fig. 6 shows a schematic diagram of a composition structure of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, features and advantages of the present application more apparent and understandable, the technical solutions of the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments herein without inventive effort fall within the scope of protection of the present application.
First, before the embodiments of the present application are described in further detail, the terms involved in the embodiments are explained as follows.
CIM (City Information Modeling, city information model): on the basis of BIM (Building Information Modeling), GIS (Geographic Information Systems) and IoT (Internet of Things) technologies, CIM integrates multidimensional, multi-scale information model data of the city (above ground and underground, indoor and outdoor, covering its history, current situation and future) together with city sensing data, to construct a city information complex in three-dimensional digital space.
BIM refers to tools for architecture, engineering and civil engineering that support computer-aided design based mainly on three-dimensional graphics and object-oriented modeling of buildings.
IoT refers to connecting, in real time, any object or process that needs to be monitored or interacted with, through devices and technologies such as information sensors, radio-frequency identification, global positioning systems, infrared sensors and laser scanners; collecting the required information such as sound, light, heat, electricity, mechanics, chemistry, biology and position; and accessing it through any available network, thereby realizing ubiquitous connection between things and people and intelligent perception, identification and management of objects and processes.
Fig. 1 shows a schematic implementation flow chart of an auxiliary flight method of an unmanned aerial vehicle according to an embodiment of the application.
Referring to fig. 1, the unmanned aerial vehicle auxiliary flight method in the embodiment of the application comprises: operation 101, receiving a navigation request from an unmanned aerial vehicle navigation application terminal; operation 102, acquiring flight information of the unmanned aerial vehicle in response to the navigation request, the flight information comprising flight timestamp information, flight start point information and flight end point information; operation 103, acquiring unmanned aerial vehicle flight-restriction information and a target flight scene model according to the flight information; and operation 104, generating an unmanned aerial vehicle navigation path based on the target flight scene model according to the flight-restriction information and the flight information, so as to assist the unmanned aerial vehicle in flying.
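Operations 101 to 104 can be sketched as a single request handler. This is a hypothetical Python sketch, not the patent's implementation; the `restriction_db` and `cim_platform` interfaces and all field names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class FlightInfo:
    timestamp: float   # planned departure time (flight timestamp information)
    start: tuple       # (x, y, z) flight start point
    end: tuple         # (x, y, z) flight end point

def assist_flight(request, restriction_db, cim_platform):
    """Operations 101-104 as one handler (illustrative interfaces)."""
    # Operation 101: the navigation request has been received as `request`.
    # Operation 102: extract the flight information carried by the request.
    info = FlightInfo(request["timestamp"], request["start"], request["end"])
    # Operation 103: look up restriction info and the scene model for the
    # region flown over.
    restrictions = restriction_db.lookup(info.start, info.end, info.timestamp)
    scene = cim_platform.load_scene(info.start, info.end)
    # Operation 104: plan a path through the 3-D scene that avoids every
    # restriction, and return it to assist the flight.
    return cim_platform.plan_path(scene, restrictions, info.start, info.end)
```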
In operation 101, a navigation request from the unmanned aerial vehicle navigation application terminal is received.
First, a navigation request from the unmanned aerial vehicle navigation application terminal is received. The unmanned aerial vehicle navigation application terminal is any application terminal that requires unmanned aerial vehicle auxiliary flight, and may be an unmanned aerial vehicle operator, the company to which the unmanned aerial vehicle belongs, or the unmanned aerial vehicle itself.
Specifically, when a user needs to direct the unmanned aerial vehicle to a specified place, the unmanned aerial vehicle must be navigated so that its flight path is defined; to request navigation, the user can submit a navigation request through a terminal device such as a mobile phone or a computer. The user described here is the navigation application terminal, which may be an unmanned aerial vehicle operator, the company to which the unmanned aerial vehicle belongs, or the unmanned aerial vehicle itself.
Further, after the unmanned aerial vehicle navigation application terminal submits the navigation request, the navigation request is received.
In operation 102, flight information of the unmanned aerial vehicle is acquired in response to the navigation request, the flight information including flight timestamp information, flight start point information and flight end point information.
Specifically, in response to the navigation request, the flight timestamp information, flight start point information and flight end point information of the unmanned aerial vehicle are first acquired, so as to assist the unmanned aerial vehicle in flying.
In operation 103, unmanned aerial vehicle flight-restriction information and a target flight scene model are acquired according to the flight information.
Specifically, a city publishes unmanned aerial vehicle flight-restriction information, which may comprise no-fly information and height-limit information; the no-fly information comprises the no-fly regions, and the height-limit information comprises the height-limited regions. Once the flight time, start point and end point of the unmanned aerial vehicle are determined, the region flown over can be determined from the flight start point and end point information, and the no-fly and height-limit information of that region can be acquired. The flight-restriction information can be obtained through communication with relevant departments such as the civil aviation administration, so that the latest restriction information of the region flown over is acquired. Therefore, compared with traditional unmanned aerial vehicle auxiliary flight schemes that lack information such as no-fly and height-limited regions, the present application provides basic data for unmanned aerial vehicle navigation by acquiring this information, and can provide auxiliary flight service for the unmanned aerial vehicle more accurately.
Furthermore, a target flight scene model of the area where the unmanned aerial vehicle flies needs to be obtained through the CIM platform.
The CIM platform integrates data such as the city geographic information model, building floor plans and oblique photography models, and can faithfully reflect the basic structure and dimension data of buildings, utility tunnels, gardens and the like in the city. After the flight region is determined from the flight start point and end point information, the model data inside the CIM platform can be divided into grids to obtain the target flight scene model, which contains the relevant data of the unmanned aerial vehicle's flight region. Therefore, by using only part of the CIM platform's data, only the scene data of the flight region needs to be loaded to obtain the target flight scene model, which improves rendering efficiency and thus the efficiency of unmanned aerial vehicle auxiliary flight.
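The idea of grid-dividing the CIM model data and loading only the tiles of the flight region can be sketched as follows. This is an illustrative sketch under the assumption of a uniform square grid over horizontal coordinates; the tiling scheme and function name are not taken from the patent.

```python
import math

def tiles_for_flight(start_xy, end_xy, tile_size):
    """Return the (ix, iy) grid tiles covering the axis-aligned region
    spanned by the horizontal coordinates of the flight start and end
    points, so only those tiles of CIM scene data need to be loaded."""
    (x0, y0), (x1, y1) = start_xy, end_xy
    ix0, ix1 = sorted((math.floor(x0 / tile_size), math.floor(x1 / tile_size)))
    iy0, iy1 = sorted((math.floor(y0 / tile_size), math.floor(y1 / tile_size)))
    return [(ix, iy)
            for ix in range(ix0, ix1 + 1)
            for iy in range(iy0, iy1 + 1)]
```

A 2.5 km flight with 100 m tiles, for example, touches only a handful of tiles instead of the whole city model.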
In operation 104, an unmanned aerial vehicle navigation path is generated based on the target flight scene model according to the flight-restriction information and the flight information, so as to assist the unmanned aerial vehicle in flying.
Specifically, the target flight scene model is a live-action three-dimensional map data model. Once the flight-restriction information and the target flight scene model are determined, the navigation capability of the CIM platform can be used to determine the unmanned aerial vehicle navigation path based on the target flight scene model; the path does not pass through the no-fly regions and height-limited regions indicated by the flight-restriction information. By assisting unmanned aerial vehicle flight with the CIM platform, which provides a large-scale three-dimensional live-action view, the drawback of current practice, in which an operator must fly the vehicle using its camera view or a two-dimensional map, is avoided; manpower is freed, carrying efficiency is improved, and operating costs are reduced. The approach also effectively solves the problem of restricted flight capability in environments such as darkness, rain, snow and smog, avoiding the problems caused by being unable to confirm the course and command the flight with the naked eye.
Fig. 2 is a schematic implementation flow chart of a navigation request authentication method of the unmanned aerial vehicle auxiliary flight method according to the embodiment of the application.
Referring to fig. 2, in an embodiment of the present application the navigation request carries unmanned aerial vehicle device information and user information, and before the flight information of the unmanned aerial vehicle is acquired in response to the navigation request, the method further comprises: operation 201, judging, according to the device information and the user information, whether the unmanned aerial vehicle requesting flight navigation complies with the flight-authority authentication rules; and operation 202, acquiring the flight information of the unmanned aerial vehicle in response to the navigation request if the unmanned aerial vehicle complies with the rules.
In operation 201, it is determined, according to the unmanned aerial vehicle device information and the user information, whether the unmanned aerial vehicle requesting flight navigation complies with the flight-authority authentication rules.
Specifically, in order to prevent incidents such as unauthorized flight, covert surveillance and theft, after the unmanned aerial vehicle navigation application terminal submits a navigation request, flight-authority authentication must be performed according to the user information and the unmanned aerial vehicle device information carried in the request. Auxiliary flight service is provided only if the unmanned aerial vehicle complies with the flight-authority authentication rules. The rules may include, for example, that the unmanned aerial vehicle holds an operating licence and that the user holds a flight licence.
Further, if the unmanned aerial vehicle passes the flight-authority authentication, the user information and device information are stored. By authenticating authority, assisting only unmanned aerial vehicles that comply with the rules, and recording their relevant information for inspection by city unmanned aerial vehicle administrators, the flight of unmanned aerial vehicles can be supervised, behaviours such as unauthorized or disorderly flight within the city can be avoided, and city-level unmanned aerial vehicle flight management is strengthened.
In operation 202, if the unmanned aerial vehicle complies with the flight-authority authentication rules, the flight information of the unmanned aerial vehicle is acquired in response to the navigation request.
Specifically, only once the unmanned aerial vehicle complies with the authority authentication rules does acquisition of its flight information begin, so as to navigate the unmanned aerial vehicle.
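Operations 201 and 202 amount to a gate in front of the flight-information lookup. The sketch below is hypothetical; the licence-field names and the shape of the request are illustrative assumptions, not part of the patent.

```python
def authorized(device_info, user_info):
    """Operation 201 as a predicate: the unmanned aerial vehicle must hold
    an operating licence and the user a flight licence (field names are
    assumed for illustration)."""
    return (device_info.get("operating_licence_valid", False)
            and user_info.get("flight_licence_valid", False))

def handle_request(request, fetch_flight_info):
    """Operation 202: proceed to the flight information only after the
    request passes flight-authority authentication."""
    if not authorized(request["device"], request["user"]):
        return None  # reject: does not comply with the authentication rules
    return fetch_flight_info(request)
```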
In an embodiment of the application, the unmanned aerial vehicle flight-restriction information includes no-fly information and height-limit information; the no-fly and height-limit information of the region the flight passes through, together with the target flight scene model corresponding to that region, can be determined according to the flight timestamp information, the flight start point information and the flight end point information.
Unmanned aerial vehicles are prohibited from flying in no-fly regions, and must mind their flight altitude near important sites such as flight-control departments, key targets and government offices, or densely populated places such as squares and bus stops, which are subject to height limits. Therefore, to better assist the unmanned aerial vehicle in flying, the real-time height-limit and no-fly information of the region it intends to fly through must be acquired.
Further, the civil aviation administration and the relevant departments it authorizes publish the unmanned aerial vehicle no-fly and height-limit information externally, and update it whenever it changes. In this way, the height-limit and no-fly information can be obtained from the civil aviation administration and other relevant departments; it is acquired through official channels and complies with the relevant laws and regulations.
Further, the live-action three-dimensional map data model in the CIM platform corresponding to the region the flight passes through can be determined as the target flight scene model.
Fig. 3 is a schematic implementation flow diagram of a navigation path determining method of the unmanned aerial vehicle auxiliary flight method according to an embodiment of the present application.
Referring to fig. 3, the following operations may be employed to generate the unmanned aerial vehicle navigation path: operation 301, generating a flight vector based on the flight start point information and the flight end point information; operation 302, taking the height-limited regions, the no-fly regions and the solid models in the target flight scene model as collision bodies, and performing collision detection between the flight vector and the collision bodies to obtain target collision bodies; and operation 303, generating the unmanned aerial vehicle navigation path based on the target flight scene model according to the target collision bodies and the flight information.
Specifically, to prevent the generated unmanned aerial vehicle navigation path from passing through height-limited areas, no-fly areas, buildings and other entities, the requirement that the path avoid these entities can be converted into a line-segment-versus-object collision problem, and a navigation path that collides with none of them is then generated.
In operation 301, a flight vector is generated based on flight start point information and flight end point information.
Specifically, the flight start point information comprises flight start point coordinates, and the flight end point information comprises flight end point coordinates. According to these coordinates, a flight vector from the flight start point to the flight end point is generated in the target flight scene model provided by the CIM platform.
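As an illustration, the flight vector can be represented as a line segment between the two coordinate points, with its direction given by the component-wise difference. The sketch below is a minimal, hypothetical example; the coordinate representation and function name are assumptions for illustration, not part of the CIM platform's API:

```python
def flight_vector(start, end):
    """Direction components of the segment from the flight start point to the
    flight end point, each point given as an (x, y, z) tuple of coordinates."""
    return tuple(e - s for s, e in zip(start, end))

# The flight segment itself is the pair (start, end); the tuple returned
# here is its direction, used by the collision tests that follow.
```
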
In operation 302, the height limit information, the flight prohibition information and the solid models in the target flight scene model are used as collision bodies, and collision detection is performed on the flight vector according to the collision bodies to obtain a target collision body.
Specifically, the unmanned aerial vehicle navigation path cannot pass through any entity, any no-fly area indicated by the flight prohibition information, or any height-limited area indicated by the height limit information. Thus, all solid models, height-limited areas and no-fly areas in the target flight scene model can be defined as collision bodies. A solid model in the target flight scene model is a model of any entity in the three-dimensional real scene, and may include, for example, plant models, building models, and models of natural objects such as mountains.
Further, after the collision bodies in the target flight scene model are determined, the collision bodies that collide with the flight vector can be determined as target collision bodies through collision detection. It should be noted that conventional collision detection methods can accomplish this determination, so details are not repeated here.
In operation 303, a drone navigation path is generated based on the target flight scene model from the target collision volume and the flight information.
Specifically, after the target collision bodies that collide with the flight vector are determined, a flight path from the flight start point to the flight end point that does not collide with any collision body is determined based on the target flight scene model, yielding the unmanned aerial vehicle navigation path.
Fig. 4 is a schematic implementation flow diagram of a target collision body determining method of the unmanned aerial vehicle auxiliary flight method according to the embodiment of the application.
Referring to fig. 4, performing collision detection on the flight vector according to the collision bodies to obtain the target collision body includes: operation 401, calculating bounding spheres of the collision bodies; operation 402, performing a first collision detection on the flight vector according to the bounding spheres, and removing collision bodies that do not collide with the flight vector to obtain first collision bodies; operation 403, calculating axis-aligned bounding boxes of the first collision bodies; operation 404, performing a second collision detection on the flight vector according to the axis-aligned bounding boxes, and removing the collision bodies in the first collision bodies that do not collide with the flight vector to obtain second collision bodies; operation 405, calculating oriented bounding boxes of the second collision bodies; operation 406, determining, according to the oriented bounding boxes, the collision bodies in the second collision bodies that collide with the flight vector, obtaining the target collision body.
In operation 401, a bounding sphere of a collision volume is calculated.
Specifically, before the collision bodies that collide with the flight vector are determined, some collision bodies that clearly do not collide with it should be eliminated in advance. First, based on the target flight scene model, bounding spheres of the solid models, no-fly areas and height-limited areas within it may be calculated.
In operation 402, a first collision detection is performed on the flight vector based on the bounding sphere, and a collision volume that does not collide with the flight vector is removed, resulting in a first collision volume.
Specifically, after the bounding spheres of the collision bodies in the target flight scene model are calculated, the collision bodies whose bounding spheres do not collide with the flight vector are removed; the remaining collision bodies are collectively called first collision bodies. For the specific first collision detection process, reference may be made to standard sphere collision detection methods, which are not described here.
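This first culling pass can be sketched as a segment-versus-sphere test: a bounding sphere touches the flight segment exactly when the closest point of the segment to the sphere's center lies within the radius. The sketch below is illustrative only; the point and sphere representations are assumptions for the example:

```python
import math

def closest_point_on_segment(p, a, b):
    """Point on segment ab closest to point p; all points are (x, y, z)."""
    ab = [b[i] - a[i] for i in range(3)]
    denom = sum(c * c for c in ab)
    if denom == 0.0:            # degenerate segment: a == b
        return list(a)
    t = sum((p[i] - a[i]) * ab[i] for i in range(3)) / denom
    t = max(0.0, min(1.0, t))   # clamp to stay on the segment
    return [a[i] + t * ab[i] for i in range(3)]

def segment_hits_sphere(a, b, center, radius):
    """True if the flight segment ab touches the bounding sphere."""
    q = closest_point_on_segment(center, a, b)
    return math.dist(q, center) <= radius

def first_collision_pass(a, b, spheres):
    """Keep only collision bodies whose bounding sphere the segment touches."""
    return [s for s in spheres
            if segment_hits_sphere(a, b, s["center"], s["radius"])]
```

The surviving entries correspond to the "first collision bodies" that proceed to the AABB stage.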
In operation 403, an axis-aligned bounding box of the first collision volume is calculated.
After the first collision detection, the collision bodies that do not collide with the flight vector have not yet been completely removed, so further collision detection is required.
To simplify collision detection between objects, a regular geometric shape is usually created to enclose each object; this is known as the AABB (axis-aligned bounding box) technique. Through the AABB technique, based on the target flight scene model, the axis-aligned bounding boxes of the solid models, no-fly areas and height-limited areas in the target flight scene model can be calculated.
In operation 404, a second collision detection is performed on the flight vector according to the axis-aligned bounding boxes, and the collision bodies in the first collision bodies that do not collide with the flight vector are removed to obtain second collision bodies.
Specifically, after the axis-aligned bounding boxes of the first collision bodies in the target flight scene model are calculated, the collision bodies whose axis-aligned bounding boxes do not collide with the flight vector are removed; the remaining collision bodies are collectively called second collision bodies. For the specific second collision detection process, reference may be made to existing AABB-based collision detection, which is not described here.
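The second pass can be illustrated with the standard "slab" method for segment-versus-AABB intersection: the interval of the segment parameter t lying inside each axis-aligned slab is intersected across all three axes, and the box is hit only if a non-empty interval remains. A minimal sketch, with the box represented by its min and max corners (an assumption of this example):

```python
def segment_hits_aabb(a, b, box_min, box_max):
    """Slab test: clip the segment parameter interval [0, 1] against each
    axis slab of the axis-aligned bounding box."""
    tmin, tmax = 0.0, 1.0
    for i in range(3):
        d = b[i] - a[i]
        if abs(d) < 1e-12:
            # Segment parallel to this slab: hit only if it lies inside it.
            if a[i] < box_min[i] or a[i] > box_max[i]:
                return False
        else:
            t1 = (box_min[i] - a[i]) / d
            t2 = (box_max[i] - a[i]) / d
            if t1 > t2:
                t1, t2 = t2, t1
            tmin, tmax = max(tmin, t1), min(tmax, t2)
            if tmin > tmax:      # intervals no longer overlap: no hit
                return False
    return True
```
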
In operation 405, a directed bounding box of the second collision volume is calculated.
Specifically, the collision bodies rejected by the AABB technique are those relatively far from the flight vector; collision bodies closer to the flight vector that still do not actually intersect it need to be rejected by a finer test.
An OBB (Oriented Bounding Box, also called a directed or directional bounding box) moves, scales and rotates with its object, and can be regarded as an AABB that is able to rotate. Compared with an AABB, an OBB therefore provides higher collision detection accuracy.
Further, an oriented bounding box is calculated for each of the second collision bodies screened out by the second collision detection.
In operation 406, a collision volume of the second collision volume that collides with the flight vector is determined from the directed bounding box, resulting in a target collision volume.
Specifically, after the oriented bounding boxes of the second collision bodies in the target flight scene model are calculated, the collision bodies whose oriented bounding boxes do not collide with the flight vector are removed, yielding the target collision bodies. For this collision detection process, reference may be made to existing OBB-based collision detection, which is not described in detail here.
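A segment-versus-OBB test can be sketched using the separating axis theorem: the candidate axes are the three box axes, the segment direction, and the cross products of the segment's half-vector with each box axis; if the projections onto any candidate axis are disjoint, there is no collision. The box representation here (center, orthonormal axes, half-sizes) and all function names are assumptions of this illustrative sketch:

```python
def _dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def _cross(u, v):
    return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])

def segment_hits_obb(a, b, center, axes, half_sizes):
    """Separating-axis test between segment ab and an oriented bounding box
    given by its center, three orthonormal axes, and half-extents."""
    mid = [(a[i] + b[i]) / 2 - center[i] for i in range(3)]  # segment midpoint, box frame
    half = tuple((b[i] - a[i]) / 2 for i in range(3))        # segment half-vector
    candidates = list(axes) + [half] + [_cross(half, ax) for ax in axes]
    for axis in candidates:
        if _dot(axis, axis) < 1e-12:
            continue                                         # degenerate axis, skip
        r_box = sum(h * abs(_dot(axis, ax)) for h, ax in zip(half_sizes, axes))
        r_seg = abs(_dot(axis, half))
        if abs(_dot(axis, mid)) > r_box + r_seg:
            return False                                     # separating axis found
    return True                                              # no separating axis: collision
```
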
In an embodiment of the present application, after the target collision bodies are determined, the separating axis theorem may be applied and the unmanned aerial vehicle navigation path generated based on the CH (Contraction Hierarchies) algorithm, an acceleration technique for finding shortest paths in a graph.
Specifically, the collision points at which the flight vector collides with the collision bodies can be determined by means of the separating axis theorem; after the collision points are determined, the unmanned aerial vehicle navigation path can be generated based on the CH algorithm. For the determination of collision points, reference may be made to the separating axis theorem, and for the generation of the navigation path, to the CH algorithm; details are not repeated here.
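By way of illustration only — this is not the CH algorithm, whose details are outside the scope of this description — a naive way to route around a single detected collision point is to insert a waypoint above the obstacle. All names and the default clearance value below are assumptions of the sketch:

```python
def detour_over_obstacle(start, end, collision_point, obstacle_top_z, clearance=5.0):
    """Replace the direct flight segment with a three-point polyline that
    climbs over the obstacle at the detected collision point."""
    waypoint = (collision_point[0], collision_point[1], obstacle_top_z + clearance)
    return [start, waypoint, end]
```

A real planner would repeat such avoidance for every target collision body and then search the resulting graph for the shortest collision-free path.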
The method or algorithm for generating the unmanned aerial vehicle navigation path according to the target collision body is not limited, and any method capable of generating the unmanned aerial vehicle navigation path according to the target collision body belongs to the protection scope of the application.
In an embodiment of the present application, it is further determined, according to the start point coordinates and end point coordinates of the unmanned aerial vehicle navigation path, whether the path passes through a no-fly zone. If it passes through a no-fly zone, forbidden certificate information is obtained, and the navigation path is provided to the unmanned aerial vehicle navigation application terminal only if the forbidden certificate information complies with the no-fly regulations.
Specifically, after the unmanned aerial vehicle navigation path is generated, whether it passes through a no-fly zone needs to be determined again, which can be done from the start point and end point coordinates of the path. If the path does not pass through a no-fly zone, the current navigation path is provided directly to the unmanned aerial vehicle navigation application terminal.
Further, the generated navigation path may pass through a no-fly zone; in that case, the forbidden certificate information corresponding to the no-fly zone can be obtained. If the forbidden certificate information exists and has not expired, it is deemed to comply with the no-fly regulations, the no-fly restriction of the zone can be lifted, and the current navigation path is provided to the unmanned aerial vehicle navigation application terminal.
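The rule above — a forbidden certificate is honored only if it exists and has not expired — can be sketched as follows; the certificate representation and field name are assumptions of this example:

```python
from datetime import date

def certificate_permits_flight(certificate, flight_day):
    """True only if forbidden certificate information exists and its
    expiry date is on or after the day of the flight."""
    return certificate is not None and certificate["expires"] >= flight_day
```
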
If the path passes through a no-fly zone and the forbidden certificate information does not comply with the no-fly regulations, the unmanned aerial vehicle navigation path needs to be planned again.
In a specific application example of the present application, the unmanned aerial vehicle auxiliary flight method may be implemented through a navigation platform and a CIM platform in two-way communication. Accordingly, the method of this application example includes the following. The unmanned aerial vehicle navigation application terminal logs in to the navigation platform; the navigation platform verifies the user identity and the unmanned aerial vehicle operation license, re-authenticates the unmanned aerial vehicle if the verification fails, and, if the verification passes, acquires the flight start point information and flight end point information and generates flight activity authorization information. The navigation platform acquires the latest restricted flight areas and sends them to the CIM platform to update the restricted flight areas there, and then sends the activity authorization information, flight start point information and flight end point information to the CIM platform. The CIM platform generates the unmanned aerial vehicle navigation path and judges whether it crosses a no-fly zone and whether a forbidden certificate for that zone exists; if the path crosses a no-fly zone without a valid forbidden certificate, the forbidden certificate is re-entered or the destination is changed; if the path does not cross a no-fly zone, or a valid forbidden certificate exists, navigation data comprising the navigation path is sent to the navigation platform and a flight log is recorded.
Therefore, based on the data service and navigation capability of the CIM platform, the unmanned aerial vehicle can be assisted in flight more accurately. Moreover, based on the present application, all unmanned aerial vehicle flight activities in a city must be applied for and reported through the unmanned aerial vehicle navigation platform, and the CIM platform collects data uniformly and plans the navigation paths, so that unmanned aerial vehicle flight behavior in the city can be well planned, supervised and managed, reducing or even eliminating unauthorized ("black") flights.
Based on the unmanned aerial vehicle auxiliary flight method, the embodiment of the application also provides an unmanned aerial vehicle auxiliary flight system, which comprises a CIM platform and a navigation platform which are in two-way communication. The navigation platform is used for receiving a navigation request of the unmanned aerial vehicle navigation application terminal, and responding to the navigation request to acquire flight information of the unmanned aerial vehicle, wherein the flight information comprises flight time stamp information, flight start point information and flight end point information; and the CIM platform is used for acquiring the unmanned aerial vehicle flight limiting information and the target flight scene model according to the flight information, and generating an unmanned aerial vehicle navigation path based on the target flight scene model according to the unmanned aerial vehicle flight limiting information and the flight information so as to assist the unmanned aerial vehicle to fly.
Specifically, the navigation platform is responsible for providing an open user registration function, supporting authentication of unmanned aerial vehicle operation licenses, supporting the addition or association of unmanned aerial vehicle equipment, supporting synchronization of data such as restricted flight zones and height-limited zones from the civil aviation bureau and other related departments, supporting authentication of forbidden certificates, and performing flight registration.
When a user logs in to the navigation platform and initiates a navigation request, authorization authentication information is generated from the user information and the information of the unmanned aerial vehicle equipment for which navigation is requested; this serves as the authorization information for the flight activity and is recorded in the navigation platform, facilitating subsequent statistics on and management of urban unmanned aerial vehicle flights. The navigation platform also supports synchronizing and editing restricted areas such as restricted flight zones and height-limited zones, and can communicate with the civil aviation bureau and other related departments to obtain the latest unmanned aerial vehicle flight restriction information for the area. After updating this information, the navigation platform synchronizes it to the CIM platform for subsequent navigation calculation.
After the navigation platform completes the authentication of the navigation request, it sends the request to the CIM platform; the request contains the authorization information of the flight activity, the flight time stamp information, the flight start point information, the flight end point information, and the like. After receiving the navigation request, the CIM platform combines the flight start point and end point information with the flight restriction area information, generates the unmanned aerial vehicle navigation path by means of its own route planning algorithm, and returns the path to the navigation platform.
Thus, combined with the capabilities of the CIM platform, the unmanned aerial vehicle can be assisted in flight more accurately; and since all unmanned aerial vehicle flights require authentication by the navigation platform, unmanned aerial vehicle flight behavior in the city can be well planned, supervised and managed, reducing or even eliminating unauthorized flights.
Fig. 5 shows a schematic diagram of the composition structure of the unmanned aerial vehicle auxiliary flying device according to the embodiment of the application.
Based on the above method for unmanned aerial vehicle auxiliary flight, the embodiment of the application also provides an unmanned aerial vehicle auxiliary flight device, and the device 50 includes: a receiving module 501, configured to receive a navigation request from a navigation application end of an unmanned aerial vehicle; the first obtaining module 502, configured to obtain flight information of the unmanned aerial vehicle in response to the navigation request, where the flight information includes flight time stamp information, flight start point information, and flight end point information; a second obtaining module 503, configured to obtain flight restriction information of the unmanned aerial vehicle and a target flight scene model according to the flight information; the path determining module 504, configured to generate, according to the unmanned aerial vehicle flight restriction information and the flight information, an unmanned aerial vehicle navigation path based on the target flight scene model, so as to assist the unmanned aerial vehicle in flying.
In an embodiment of the present application, the unmanned aerial vehicle navigation request carries unmanned aerial vehicle equipment information and user information; correspondingly, the device also comprises a judging module, which is used for judging whether the unmanned aerial vehicle requesting flight navigation accords with the flight authority authentication regulation according to the unmanned aerial vehicle equipment information and the user information; under the condition that the unmanned aerial vehicle accords with the flight authority authentication regulation, responding to the unmanned aerial vehicle navigation request, and acquiring flight information of the unmanned aerial vehicle.
In an embodiment of the present application, the unmanned aerial vehicle flight limiting information includes flight forbidden information and height limiting information; correspondingly, the second obtaining module 503 is configured to determine, according to the flight time stamp information, the flight start point information, and the flight end point information, flight prohibition information and height limit information of an area where the unmanned aerial vehicle passes, and a target flight scene model corresponding to the area where the unmanned aerial vehicle passes.
In one embodiment of the present application, the path determining module 504 includes: the generating sub-module is used for generating a flight vector based on the flight start point information and the flight end point information; the collision detection sub-module is used for taking the height limit information, the flight prohibition information and the entity model in the target flight scene model as a collision body, and carrying out collision detection on the flight vector according to the collision body to obtain a target collision body; and the path determination submodule is used for generating an unmanned aerial vehicle navigation path based on the target flight scene model according to the target collision body and the flight information.
In one embodiment of the present application, the collision detection submodule includes: a surrounding sphere calculation unit for calculating a surrounding sphere of the collision body; the first collision detection unit is used for carrying out first collision detection on the flight vector according to the surrounding ball, and removing collision bodies which do not collide with the flight vector to obtain first collision bodies; an axis alignment bounding box calculation unit for calculating an axis alignment bounding box of the first collision body; the second collision detection unit is used for carrying out second collision detection on the flight vector according to the axis alignment bounding box, and removing collision bodies which do not collide with the flight vector in the first collision bodies to obtain second collision bodies; a directional bounding box calculation unit for calculating a directional bounding box of the second collision volume; and the target collision body determining unit is used for determining a collision body colliding with the flight vector in the second collision body according to the directed bounding box to obtain the target collision body.
In an embodiment of the present application, the apparatus 50 further includes: the no-fly zone determining module is used for determining whether the unmanned aerial vehicle navigation path passes through the flying no-fly zone according to the starting point coordinates and the ending point coordinates of the unmanned aerial vehicle navigation path; the forbidden certificate information acquisition module is used for acquiring forbidden certificate information under the condition of passing through the flying forbidden zone; the navigation providing module is used for providing the unmanned aerial vehicle navigation path for the flight navigation application terminal under the condition that the forbidden certificate information accords with the forbidden regulations.
It should be noted that, the description of the apparatus in the embodiment of the present application is similar to the description of the embodiment of the method described above, and has similar beneficial effects as the embodiment of the method, so that a detailed description is omitted. The technical details of the unmanned aerial vehicle auxiliary flying device provided in the embodiment of the present application may be understood according to the description of any one of fig. 1 to 4.
According to embodiments of the present application, there is also provided an electronic device and a non-transitory computer-readable storage medium.
Fig. 6 shows a schematic block diagram of an example electronic device 60 that may be used to implement embodiments of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 6, the electronic device 60 includes a computing unit 601 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 602 or a computer program loaded from a storage unit 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the electronic device 60 can also be stored. The computing unit 601, ROM 602, and RAM 603 are connected to each other by a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Various components in the electronic device 60 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, mouse, etc.; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the electronic device 60 to exchange information/data with other devices through a computer network, such as the internet, and/or various telecommunication networks.
The computing unit 601 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 601 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 601 performs the various methods and processes described above, such as the unmanned aerial vehicle assisted flight method. For example, in some embodiments, the unmanned aerial vehicle assisted flight method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 60 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into RAM 603 and executed by the computing unit 601, one or more steps of the unmanned aerial vehicle assisted flight method described above may be performed. Alternatively, in other embodiments, the computing unit 601 may be configured to perform the unmanned aerial vehicle assisted flight method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor, that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present application may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this application, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions disclosed in the present application can be achieved, and are not limited herein.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes or substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. A method of unmanned aerial vehicle assisted flight, the method comprising:
receiving a navigation request of an unmanned aerial vehicle navigation application terminal;
responding to the navigation request, and acquiring flight information of the unmanned aerial vehicle, wherein the flight information comprises flight time stamp information, flight starting point information and flight end point information;
acquiring unmanned aerial vehicle flight limiting information and a target flight scene model according to the flight information, wherein the unmanned aerial vehicle flight limiting information comprises flight forbidden information and height limiting information;
generating an unmanned aerial vehicle navigation path based on the target flight scene model according to the unmanned aerial vehicle flight limiting information and the flight information so as to assist the unmanned aerial vehicle to fly;
wherein, according to the unmanned aerial vehicle flight limiting information and the flight information, generating an unmanned aerial vehicle navigation path based on the target flight scene model includes:
generating a flight vector based on the flight starting point information and the flight end point information;
taking the height limiting information, the flight forbidden information and a solid model in the target flight scene model as collision bodies, and performing collision detection on the flight vector according to the collision bodies to obtain a target collision body;
generating the unmanned aerial vehicle navigation path based on the target flight scene model according to the target collision body and the flight information;
wherein performing collision detection on the flight vector according to the collision bodies to obtain the target collision body includes:
calculating a bounding sphere of each collision body;
performing first collision detection on the flight vector according to the bounding spheres, and removing the collision bodies which do not collide with the flight vector to obtain first collision bodies;
calculating an axis-aligned bounding box of each first collision body;
performing second collision detection on the flight vector according to the axis-aligned bounding boxes, and removing the first collision bodies which do not collide with the flight vector to obtain second collision bodies;
calculating an oriented bounding box of each second collision body;
and determining, according to the oriented bounding boxes, the second collision bodies which collide with the flight vector to obtain the target collision body.
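The three-stage detection recited in claim 1 (bounding sphere, then axis-aligned bounding box, then oriented bounding box) is a classic coarse-to-fine culling scheme: each stage is cheaper than the next and only survivors of one stage reach the following one. The sketch below treats the flight vector as a 3D line segment and is purely illustrative; the function names, the collider dictionary keys, and the representation of colliders as precomputed bounding volumes are assumptions of this sketch, not details taken from the patent.

```python
import math

def segment_hits_sphere(p0, p1, center, radius):
    """Stage 1 (coarsest): does the segment p0->p1 intersect the bounding sphere?
    Solves |p0 + t*d - center|^2 = r^2 and checks overlap with t in [0, 1]."""
    d = [b - a for a, b in zip(p0, p1)]
    f = [a - c for a, c in zip(p0, center)]
    a = sum(x * x for x in d)                      # assumes p0 != p1
    b = 2.0 * sum(x * y for x, y in zip(f, d))
    c = sum(x * x for x in f) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return False
    sq = math.sqrt(disc)
    t1, t2 = (-b - sq) / (2.0 * a), (-b + sq) / (2.0 * a)
    return t2 >= 0.0 and t1 <= 1.0

def segment_hits_aabb(p0, p1, lo, hi):
    """Stage 2 (finer): slab test of the segment against an axis-aligned box."""
    tmin, tmax = 0.0, 1.0
    for i in range(3):
        d = p1[i] - p0[i]
        if abs(d) < 1e-12:                         # segment parallel to this slab
            if p0[i] < lo[i] or p0[i] > hi[i]:
                return False
        else:
            t1, t2 = (lo[i] - p0[i]) / d, (hi[i] - p0[i]) / d
            if t1 > t2:
                t1, t2 = t2, t1
            tmin, tmax = max(tmin, t1), min(tmax, t2)
            if tmin > tmax:
                return False
    return True

def segment_hits_obb(p0, p1, center, axes, half):
    """Stage 3 (tightest): express the segment in the box's local frame, where
    the oriented box becomes axis-aligned, then reuse the slab test.
    axes is a 3x3 row matrix of the box's unit axes."""
    def to_local(p):
        rel = [p[i] - center[i] for i in range(3)]
        return [sum(axes[k][i] * rel[i] for i in range(3)) for k in range(3)]
    lo = [-h for h in half]
    return segment_hits_aabb(to_local(p0), to_local(p1), lo, half)

def cull_colliders(p0, p1, colliders):
    """Three-stage filtering: each stage only sees survivors of the previous one."""
    first = [c for c in colliders
             if segment_hits_sphere(p0, p1, c["sphere_center"], c["sphere_radius"])]
    second = [c for c in first
              if segment_hits_aabb(p0, p1, c["aabb_min"], c["aabb_max"])]
    return [c for c in second
            if segment_hits_obb(p0, p1, c["obb_center"], c["obb_axes"], c["obb_half"])]
```

Under this reading, a building model, a no-fly volume, or a height-limit slab would each carry three precomputed bounding volumes, and only the colliders surviving all three tests become target collision bodies handed to the path generator.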
2. The unmanned aerial vehicle assisted flight method of claim 1, wherein the navigation request carries unmanned aerial vehicle equipment information and user information; correspondingly,
before the acquiring the flight information of the unmanned aerial vehicle in response to the navigation request, the method further comprises:
judging, according to the unmanned aerial vehicle equipment information and the user information, whether the unmanned aerial vehicle requesting flight navigation complies with a flight permission authentication regulation;
and in the case that the unmanned aerial vehicle complies with the flight permission authentication regulation, responding to the navigation request and acquiring the flight information of the unmanned aerial vehicle.
3. The unmanned aerial vehicle assisted flight method according to claim 1, wherein the acquiring unmanned aerial vehicle flight limiting information and a target flight scene model according to the flight information comprises:
determining, according to the flight time stamp information, the flight starting point information and the flight end point information, the flight forbidden information and height limiting information of the area through which the unmanned aerial vehicle passes, and a target flight scene model corresponding to that area.
4. The unmanned aerial vehicle assisted flight method of claim 1, wherein the method further comprises:
determining whether the unmanned aerial vehicle navigation path passes through a flight exclusion zone according to the starting point coordinates and the end point coordinates of the unmanned aerial vehicle navigation path;
in the case that the path passes through the flight exclusion zone, acquiring exclusion-zone permit information;
and in the case that the exclusion-zone permit information complies with the exclusion-zone regulations, providing the unmanned aerial vehicle navigation path to the unmanned aerial vehicle navigation application terminal.
5. A unmanned aerial vehicle assisted flight system for implementing the unmanned aerial vehicle assisted flight method of any of claims 1 to 4, the system comprising:
the navigation platform is used for receiving a navigation request of the unmanned aerial vehicle navigation application terminal; responding to the navigation request, and acquiring flight information of the unmanned aerial vehicle, wherein the flight information comprises flight time stamp information, flight starting point information and flight end point information;
the CIM platform is used for acquiring unmanned aerial vehicle flight limiting information and a target flight scene model according to the flight information, wherein the unmanned aerial vehicle flight limiting information comprises flight forbidden information and height limiting information; and generating an unmanned aerial vehicle navigation path based on the target flight scene model according to the unmanned aerial vehicle flight limiting information and the flight information so as to assist the unmanned aerial vehicle to fly; wherein, according to the unmanned aerial vehicle flight limiting information and the flight information, generating an unmanned aerial vehicle navigation path based on the target flight scene model includes: generating a flight vector based on the flight starting point information and the flight end point information; taking the height limiting information, the flight forbidden information and a solid model in the target flight scene model as collision bodies, and performing collision detection on the flight vector according to the collision bodies to obtain a target collision body; generating the unmanned aerial vehicle navigation path based on the target flight scene model according to the target collision body and the flight information; wherein performing collision detection on the flight vector according to the collision bodies to obtain the target collision body includes: calculating a bounding sphere of each collision body; performing first collision detection on the flight vector according to the bounding spheres, and removing the collision bodies which do not collide with the flight vector to obtain first collision bodies; calculating an axis-aligned bounding box of each first collision body; performing second collision detection on the flight vector according to the axis-aligned bounding boxes, and removing the first collision bodies which do not collide with the flight vector to obtain second collision bodies; calculating an oriented bounding box of each second collision body; and determining, according to the oriented bounding boxes, the second collision bodies which collide with the flight vector to obtain the target collision body;
wherein the CIM platform is in two-way communication with the navigation platform.
6. An unmanned aerial vehicle assisted flight device, the device comprising:
the receiving module is used for receiving a navigation request of the unmanned aerial vehicle navigation application terminal;
the first acquisition module is used for acquiring flight information of the unmanned aerial vehicle, wherein the flight information comprises flight time stamp information, flight starting point information and flight end point information;
the second acquisition module is used for acquiring unmanned aerial vehicle flight limiting information and a target flight scene model according to the flight information, wherein the unmanned aerial vehicle flight limiting information comprises flight forbidden information and height limiting information;
the path determination module is used for generating an unmanned aerial vehicle navigation path based on the target flight scene model according to the unmanned aerial vehicle flight limiting information and the flight information so as to assist the unmanned aerial vehicle to fly;
wherein the path determination module comprises: a generation sub-module, used for generating a flight vector based on the flight starting point information and the flight end point information; a collision detection sub-module, used for taking the height limiting information, the flight forbidden information and a solid model in the target flight scene model as collision bodies, and performing collision detection on the flight vector according to the collision bodies to obtain a target collision body; and a path determination sub-module, used for generating the unmanned aerial vehicle navigation path based on the target flight scene model according to the target collision body and the flight information;
wherein the collision detection sub-module includes: a bounding sphere calculation unit, configured to calculate a bounding sphere of each collision body; a first collision detection unit, configured to perform first collision detection on the flight vector according to the bounding spheres, and remove the collision bodies which do not collide with the flight vector to obtain first collision bodies; an axis-aligned bounding box calculation unit, configured to calculate an axis-aligned bounding box of each first collision body; a second collision detection unit, configured to perform second collision detection on the flight vector according to the axis-aligned bounding boxes, and remove the first collision bodies which do not collide with the flight vector to obtain second collision bodies; an oriented bounding box calculation unit, configured to calculate an oriented bounding box of each second collision body; and a target collision body determination unit, configured to determine, according to the oriented bounding boxes, the second collision bodies which collide with the flight vector to obtain the target collision body.
7. The unmanned aerial vehicle assisted flight device of claim 6, wherein the path determination module comprises:
the generation sub-module is used for generating a flight vector based on the flight starting point information and the flight end point information;
the collision detection sub-module is used for taking the height limiting information, the flight forbidden information and a solid model in the target flight scene model as collision bodies, and performing collision detection on the flight vector according to the collision bodies to obtain a target collision body;
and the path determination sub-module is used for generating the unmanned aerial vehicle navigation path based on the target flight scene model according to the target collision body and the flight information.
8. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the unmanned aerial vehicle assisted flight method of any of claims 1-4.
CN202310010070.4A 2023-01-05 2023-01-05 Unmanned aerial vehicle auxiliary flight method, system, device and storage medium Active CN115793715B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310010070.4A CN115793715B (en) 2023-01-05 2023-01-05 Unmanned aerial vehicle auxiliary flight method, system, device and storage medium

Publications (2)

Publication Number Publication Date
CN115793715A CN115793715A (en) 2023-03-14
CN115793715B 2023-04-28

Family

ID=85428528

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310010070.4A Active CN115793715B (en) 2023-01-05 2023-01-05 Unmanned aerial vehicle auxiliary flight method, system, device and storage medium

Country Status (1)

Country Link
CN (1) CN115793715B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117173935B (en) * 2023-07-18 2024-02-09 北京锐士装备科技有限公司 Authentication method and system for providing authentication service for unmanned aerial vehicle

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1189231A (en) * 1995-05-30 1998-07-29 小维克多J·诺里斯 System for enhancing navigation and surveillance in low visibility conditions
CN107111319A (en) * 2015-12-25 2017-08-29 深圳市大疆创新科技有限公司 Unmanned plane during flying prompt system and method, control terminal, flight system
CN108351652A (en) * 2017-12-26 2018-07-31 深圳市道通智能航空技术有限公司 Unmanned vehicle paths planning method, device and flight management method, apparatus
CN108496134A (en) * 2017-05-31 2018-09-04 深圳市大疆创新科技有限公司 Unmanned plane makes a return voyage paths planning method and device
CN108776492A (en) * 2018-06-27 2018-11-09 电子科技大学 A kind of four-axle aircraft automatic obstacle avoiding and air navigation aid based on binocular camera
CN109073405A (en) * 2017-02-28 2018-12-21 深圳市大疆创新科技有限公司 Method and apparatus for integrated mapping data
CN109931950A (en) * 2018-09-05 2019-06-25 浙江科比特科技有限公司 A kind of real scene navigation method, system and terminal device
CN112349149A (en) * 2020-11-05 2021-02-09 中国联合网络通信集团有限公司 Internet unmanned aerial vehicle monitoring method, client, internet unmanned aerial vehicle and monitoring platform
WO2021141666A2 (en) * 2019-11-13 2021-07-15 Battelle Energy Alliance, Llc Unmanned vehicle navigation, and associated methods, systems, and computer-readable medium
CN113741490A (en) * 2020-05-29 2021-12-03 广州极飞科技股份有限公司 Inspection method, inspection device, aircraft and storage medium
CN114995519A (en) * 2022-07-29 2022-09-02 江苏复泽智能科技有限公司 Unmanned aerial vehicle AI landing method and system based on multi-obstacle scene
CN115081195A (en) * 2022-06-06 2022-09-20 北京易航远智科技有限公司 Laser radar simulation method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
EP3505869B1 (en) Method, apparatus, and computer readable storage medium for updating electronic map
JP7371157B2 (en) Vehicle monitoring method, device, electronic device, storage medium, computer program, cloud control platform and roadway coordination system
CN109472483B (en) Building site on-site modeling method and system based on BIM (building information modeling) model and aerial photography technology
CN111108538B (en) System for generating and/or updating digital models of digital maps
CN115793715B (en) Unmanned aerial vehicle auxiliary flight method, system, device and storage medium
CN111402387A (en) Removing short timepoints from a point cloud of a high definition map for navigating an autonomous vehicle
CN108255932B (en) Roaming browsing method and system of digital factory based on three-dimensional digital platform
CN114445565A (en) Data processing method and device, electronic equipment and computer readable medium
WO2020154670A1 (en) Vehicle routing with local and general routes
CN112598668B (en) Defect identification method and device based on three-dimensional image and electronic equipment
CN110782774A (en) Crowdsourcing road data distributed processing method and device
CN113189989B (en) Vehicle intention prediction method, device, equipment and storage medium
KR102012361B1 (en) Method and apparatus for providing digital moving map service for safe navigation of unmanned aerial vehicle
CN116583891A (en) Critical scene identification for vehicle verification and validation
CN115468578B (en) Path planning method and device, electronic equipment and computer readable medium
CN115790621A (en) High-precision map updating method and device and electronic equipment
CN113119999B (en) Method, device, equipment, medium and program product for determining automatic driving characteristics
CN114842207A (en) Road network generation method and device, readable storage medium and electronic equipment
JP7232727B2 (en) Map data management device and map data management method
CN114779705A (en) Method, device, electronic equipment and system for controlling automatic driving vehicle
KR102012362B1 (en) Method and apparatus for generating digital moving map for safe navigation of unmanned aerial vehicle
US20160085427A1 (en) System and method of sharing spatial data
CN115294234B (en) Image generation method and device, electronic equipment and storage medium
CN112859109B (en) Unmanned aerial vehicle panoramic image processing method and device and electronic equipment
WO2024036984A1 (en) Target localization method and related system, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant