CN113238571A - Unmanned aerial vehicle monitoring system, method, device and storage medium - Google Patents


Info

Publication number
CN113238571A
CN113238571A (application CN202110583594.3A)
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
flight
drone
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110583594.3A
Other languages
Chinese (zh)
Inventor
李腾腾
孙毅
张邦彦
景华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sankuai Online Technology Co Ltd
Original Assignee
Beijing Sankuai Online Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sankuai Online Technology Co Ltd filed Critical Beijing Sankuai Online Technology Co Ltd
Priority to CN202110583594.3A
Publication of CN113238571A
Priority to PCT/CN2022/086171 (published as WO2022247498A1)
Legal status: Pending

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08: Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808: Control of attitude specially adapted for aircraft
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106: Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Abstract

The present specification discloses an unmanned aerial vehicle monitoring system, method, apparatus and storage medium. In the system, an unmanned aerial vehicle sends its current position information to a server at a time interval, and the server forwards the information to a terminal. The terminal acquires the route information of the unmanned aerial vehicle from the server, displays the planned flight path of the unmanned aerial vehicle in a pre-constructed three-dimensional environment model, and displays the current position of the unmanned aerial vehicle in the three-dimensional environment model according to its current position information. Displaying the planned flight path and the real-time position of the unmanned aerial vehicle in the pre-constructed three-dimensional environment model makes it easier to observe yaw of the unmanned aerial vehicle in the height direction, so that the observation is more comprehensive.

Description

Unmanned aerial vehicle monitoring system, method, device and storage medium
Technical Field
The application relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle monitoring system, a method, a device and a storage medium.
Background
With the development of unmanned driving technology, unmanned equipment such as unmanned aerial vehicles has been widely applied in various business fields. However, because current unmanned aerial vehicle flight technology is not yet mature, in order to reduce the flight risk of an unmanned aerial vehicle, its flight trajectory can be monitored manually in real time, and whether the unmanned aerial vehicle flies along the planned path is judged manually.
Currently, when monitoring an unmanned aerial vehicle, a planned path for the unmanned aerial vehicle to execute a current task is usually displayed in a pre-constructed map. And then, adjusting the position of the unmanned aerial vehicle in the map in real time according to the position information of the unmanned aerial vehicle so as to observe whether the unmanned aerial vehicle deviates from the planned path.
However, the planned path and the position of the unmanned aerial vehicle displayed in the map in the prior art do not contain height information, so whether the unmanned aerial vehicle deviates from the planned path cannot be judged intuitively.
Disclosure of Invention
The embodiment of the specification provides an unmanned aerial vehicle monitoring system, a method, a device and a storage medium, which are used for partially solving the problems in the prior art.
The embodiment of the specification adopts the following technical scheme:
This specification provides an unmanned aerial vehicle monitoring system, the system comprising an unmanned aerial vehicle, a terminal and a server, wherein:
the unmanned aerial vehicle is configured to send its current position information to the server at a time interval;
the terminal carries a pre-constructed three-dimensional environment model, the three-dimensional environment model being constructed based on environment information of the flight area of the unmanned aerial vehicle; the terminal is configured to send a route acquisition request to the server and display the planned flight path of the unmanned aerial vehicle in the three-dimensional environment model according to the received route information; and, upon receiving the current position information of the unmanned aerial vehicle, to display the current position of the unmanned aerial vehicle in the three-dimensional environment model according to that information;
the server is configured to forward the received current position information of the unmanned aerial vehicle to the terminal, and to return the route information of the unmanned aerial vehicle to the terminal according to the received route acquisition request.
Optionally, the terminal is further configured to display, in the planned flight path, the progress the unmanned aerial vehicle is expected to have flown by the current time, according to the planned flight time included in the received route information.
Optionally, the pre-constructed three-dimensional environment model takes the geocenter as its origin;
the terminal is further configured to determine, according to the flight area in which the unmanned aerial vehicle executes the current mission, a reference point corresponding to that flight area;
and to update the position information of each position in the three-dimensional environment model, taking the reference point as the new origin.
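The re-origin step in this optional embodiment can be sketched as a translation of the model coordinates. The patent does not specify the transformation, so a pure translation (with no rotation into a local tangent frame) and the function name below are assumptions for illustration only.

```python
# Sketch of the re-origin step: shifting geocentric model coordinates so
# that the flight area's reference point becomes (0, 0, 0).
# A pure translation is an assumption; the patent only states that the
# reference point is re-taken as the origin.

def reorigin(points, reference):
    """Shift every (x, y, z) point so that `reference` maps to the origin."""
    rx, ry, rz = reference
    return [(x - rx, y - ry, z - rz) for x, y, z in points]
```

Re-centering this way keeps coordinate magnitudes small near the flight area, which avoids the floating-point precision loss of rendering positions millions of metres from a geocentric origin.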
Optionally, the server is further configured, when there are a plurality of unmanned aerial vehicles, to receive an unmanned aerial vehicle position acquisition request sent by the terminal carrying an unmanned aerial vehicle identifier, and to forward the current position information of the unmanned aerial vehicle corresponding to that identifier to the terminal;
and, according to the unmanned aerial vehicle identifier contained in a received route acquisition request, to determine the route information of the corresponding unmanned aerial vehicle and return it to the terminal.
Optionally, the unmanned aerial vehicle is further configured to send current state information of the unmanned aerial vehicle to the server according to a time interval, where the current state information includes flight parameters of the unmanned aerial vehicle;
the server is further configured to forward the received current state information of the unmanned aerial vehicle to the terminal;
the terminal is further configured to display the current flight parameters of the unmanned aerial vehicle according to the received current state information of the unmanned aerial vehicle.
Optionally, the terminal is further configured to display the attitude of the drone in the three-dimensional environment model according to the received flight parameters in the current state information of the drone;
wherein the flight parameters at least include a flight pose of the drone.
Optionally, the server is further configured to determine, according to the flight missions to be executed by the unmanned aerial vehicle, the flight area in which the unmanned aerial vehicle executes each flight mission;
and, for each flight mission to be executed, to acquire environment images of the flight area corresponding to that mission and construct a three-dimensional environment model of the flight area from the acquired environment images.
Optionally, the server is further configured to receive an environment model acquisition request sent by the terminal, and send the constructed three-dimensional environment model to the corresponding terminal according to the terminal identifier carried in the environment model acquisition request.
Optionally, the terminal is further configured to determine, according to the flight missions to be executed by the unmanned aerial vehicle, the flight area in which the unmanned aerial vehicle executes each flight mission;
and, for each flight mission to be executed, to acquire environment images of the flight area corresponding to that mission and construct a three-dimensional environment model of the flight area from the acquired environment images.
Optionally, the terminal is further configured to send the constructed three-dimensional environment model to the server;
and the server is further configured to store the received three-dimensional environment model and, upon receiving an environment model acquisition request sent by another terminal, to send the model to that terminal.
This specification provides an unmanned aerial vehicle monitoring method, comprising:
a terminal sends a route acquisition request to a server and displays the planned flight path of an unmanned aerial vehicle in a pre-constructed three-dimensional environment model according to the received route information, wherein the terminal carries the pre-constructed three-dimensional environment model, and the three-dimensional environment model is constructed based on environment information of the flight area of the unmanned aerial vehicle;
and, upon receiving the current position information of the unmanned aerial vehicle, the terminal displays the current position of the unmanned aerial vehicle in the three-dimensional environment model according to that information.
This specification provides an unmanned aerial vehicle monitoring apparatus, the apparatus carrying a pre-constructed three-dimensional environment model constructed based on environment information of the flight area of an unmanned aerial vehicle, and comprising:
a route request module, configured to send a route acquisition request to a server and display the planned flight path of the unmanned aerial vehicle in the three-dimensional environment model according to the received route information;
and a position display module, configured to display, upon receiving the current position information of the unmanned aerial vehicle, the current position of the unmanned aerial vehicle in the three-dimensional environment model according to that information.
The present specification provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the method for monitoring a drone is implemented.
The embodiment of the specification adopts at least one technical scheme which can achieve the following beneficial effects:
In this specification, the unmanned aerial vehicle monitoring system comprises an unmanned aerial vehicle, a server and a terminal. In the system, the unmanned aerial vehicle sends its current position information to the server at a time interval, and the server forwards it to the terminal. The terminal acquires the route information of the unmanned aerial vehicle from the server, displays the planned flight path of the unmanned aerial vehicle in a pre-constructed three-dimensional environment model, and displays the current position of the unmanned aerial vehicle in the three-dimensional environment model according to its current position information. Displaying the planned flight path and the real-time position of the unmanned aerial vehicle in the pre-constructed three-dimensional environment model makes it easier to observe yaw of the unmanned aerial vehicle in the height direction, so that the observation is more comprehensive.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic diagram of a prior-art screen for monitoring an unmanned aerial vehicle;
fig. 2 is a schematic view of a monitoring system for an unmanned aerial vehicle provided in an embodiment of the present description;
fig. 3 is a schematic view of a screen for monitoring an unmanned aerial vehicle according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a planned flight path display provided in an embodiment of the present disclosure;
fig. 5 is a schematic diagram of three-party interaction in a monitoring system of an unmanned aerial vehicle according to an embodiment of the present disclosure;
fig. 6 is a schematic flowchart of a method for monitoring an unmanned aerial vehicle according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an unmanned aerial vehicle monitoring device provided in an embodiment of this specification.
Detailed Description
In order to make the objects, technical solutions and advantages of the present disclosure more apparent, the technical solutions of the present disclosure will be clearly and completely described below with reference to the specific embodiments of the present disclosure and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person skilled in the art without making any inventive step based on the embodiments in the description belong to the protection scope of the present application.
Fig. 1 is a schematic diagram of a prior-art screen for monitoring an unmanned aerial vehicle. The two-dimensional plan view is a pre-constructed map in which each building is represented by a plane figure at its displayed position. The dotted line segment in the map represents the planned path along which the unmanned aerial vehicle executes the current mission: the unmanned aerial vehicle flies from the starting position to the end position along this planned path, and the position of the unmanned aerial vehicle icon in the map represents its current position. As can be seen from the figure, the monitoring screen displays the planned path in the horizontal direction, so the position change of the unmanned aerial vehicle in the horizontal direction can be observed.
However, considering that the unmanned aerial vehicle actually flies in three-dimensional space, height information is also an important factor in judging whether the unmanned aerial vehicle is yawing. The change in the flight height of the unmanned aerial vehicle cannot be observed in such a screen, and the height information of the planned path is not displayed, so it is difficult to judge whether the unmanned aerial vehicle yaws in the vertical dimension. Based on this, this specification provides an unmanned aerial vehicle monitoring method that can comprehensively observe the flight state of the unmanned aerial vehicle and intuitively show whether the unmanned aerial vehicle yaws.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 2 is a schematic view of the unmanned aerial vehicle monitoring system provided in an embodiment of the present disclosure. The system includes a drone 100, a server 102 and a terminal 104. The server 102 is a background server for controlling the flight of the drone 100, and information can be transmitted between the server 102 and the drone 100 through wireless communication: for example, the drone 100 can transmit information such as its real-time position and acquired images to the server 102, and the server 102 can plan a flight path according to the real-time position of the drone 100 and transmit flight control instructions to the drone 100. The terminal 104 and the server 102 may transmit information in a wired or wireless manner, and the terminal 104 may obtain the real-time position of the drone 100 and its route information from the server 102 to display the real-time flight state of the drone 100.
The server 102 may be a single server or a cluster formed by a plurality of servers, for example a distributed server system; it may be a physical server device or a cloud server, which is not limited in this specification and may be set as needed. The terminal 104 may be at least one of electronic devices such as a smartphone, a tablet computer and a desktop computer, and an application supporting display of the flight state of the drone 100 is installed and run on the terminal 104, which can show information such as the pre-planned flight path and the real-time position change of the drone 100. The number of terminals 104 may be one or more, which is not limited in this specification and may be set as needed.
In order to observe the flight state of the unmanned aerial vehicle 100 in the task execution process more intuitively, the description provides an unmanned aerial vehicle monitoring system, and the change of the unmanned aerial vehicle 100 in the flight altitude direction is additionally displayed through displaying the planned path of the unmanned aerial vehicle 100 and the change of the real-time flight position of the unmanned aerial vehicle 100 in the three-dimensional space, so that the flight state of the unmanned aerial vehicle 100 is comprehensively displayed.
Specifically, in the unmanned aerial vehicle monitoring system, when the drone 100 starts to execute a flight mission, the drone 100 may periodically send its current location information to the server 102 at a preset time interval, and the server 102 forwards the information to the terminal 104 for display. The preset time interval can be set as needed; for example, if it is set to 1 s, the drone 100 sends its position to the server 102 once per second.
It should be noted that, when transmitting the current location information to the server 102, the drone 100 may do so in response to a query from the server 102, that is, after the server 102 sends a location information acquisition request to the drone 100, the drone 100 returns its current location information. Alternatively, the drone 100 may send the information autonomously; this specification does not limit this, and it can be set as needed.
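The periodic reporting described above can be sketched as a simple loop. The uplink function, the 1 s default interval and the injectable clock are illustrative assumptions so the sketch can be exercised without a real drone or network.

```python
import itertools

def report_positions(get_position, send, interval_s=1.0, ticks=None, sleep=None):
    """Send the drone's current position once every interval_s seconds.

    get_position supplies the drone's latest fix; send stands in for the
    real uplink to the server. ticks bounds the loop and sleep is
    injectable so tests need not actually wait.
    """
    steps = itertools.count() if ticks is None else range(ticks)
    for _ in steps:
        send(get_position())       # one report per interval
        if sleep is not None:
            sleep(interval_s)
```

With `interval_s=1.0` and `sleep=time.sleep`, this matches the one-report-per-second example in the text.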
After receiving the current location information sent by the drone 100, the server 102 may forward the current location information of the drone 100 to the terminal 104, so that the terminal 104 displays the current location of the drone 100 in a pre-constructed three-dimensional environment model according to the current location information of the drone 100. When the current location information of the drone 100 is sent to the terminal 104, the server 102 may send the current location information of the drone 100 after receiving the query of the terminal 104, that is, after the terminal 104 sends the request for obtaining the location of the drone 100 to the server 102, the server 102 forwards the current location of the drone 100 to the terminal 104 according to the request for obtaining the location of the drone 100. Alternatively, the server 102 may also store a response program for sending the location information to the terminal 104 in advance, and when a program trigger condition is satisfied, the server 102 may autonomously push the current location information of the drone 100 to the terminal 104. The trigger condition includes, but is not limited to, the server 102 receiving the current location information sent by the drone 100.
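The drone-to-server-to-terminal relay described above can be sketched with in-memory stand-ins. The class and field names, and the push-on-receive trigger, are illustrative assumptions; the patent leaves the wire format and trigger conditions open.

```python
class Server:
    """Relays drone position reports and answers route requests."""

    def __init__(self, routes):
        self.routes = routes        # drone_id -> route information
        self.terminals = []         # terminals subscribed to pushes

    def register_terminal(self, terminal):
        self.terminals.append(terminal)

    def receive_position(self, report):
        # Trigger condition: a position report arrived, so push it to
        # every registered terminal (the autonomous-push variant above).
        for t in self.terminals:
            t.on_position(report)

    def get_route(self, drone_id):
        return self.routes[drone_id]


class Terminal:
    """Holds the planned path and the drone's latest reported position."""

    def __init__(self, server, drone_id):
        self.planned_path = server.get_route(drone_id)["waypoints"]
        self.current_position = None

    def on_position(self, report):
        self.current_position = (report["lon"], report["lat"], report["alt"])
```

A real deployment would replace the direct method calls with the wireless link and the wired/wireless terminal connection described in fig. 2.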
In the unmanned aerial vehicle monitoring system of this specification, the terminal 104 is used not only to display the position change of the drone 100 in real time, but also to display the flight path planned in advance for the drone 100, based on the route information of the mission the drone 100 is executing. In addition, from the deviation between the real-time position of the drone 100 and the planned flight path, the degree of yaw of the drone 100 in the horizontal and vertical directions can be observed intuitively.
Therefore, in this specification, when the terminal 104 first receives the current location information of the drone 100 forwarded by the server 102, indicating that the drone 100 starts to perform a flight mission, it may start to monitor the flight status of the drone 100. The terminal 104 may then send a route acquisition request to the server 102 to acquire route information for the drone 100 to perform the current flight mission for subsequent path display.
Moreover, after the terminal 104 receives the current position information of the drone 100, it may display the current position of the drone 100 in the pre-constructed three-dimensional environment model according to that information. When displaying the current position of the drone 100, the terminal may on the one hand show the relative position of the drone 100 within the three-dimensional environment model, and on the other hand show the three-dimensional coordinates of the current position, including information such as longitude, latitude, altitude and height above ground. As shown in fig. 3, the three-dimensional background map containing environment information in fig. 3 is the three-dimensional environment model of the flight area; the model of the drone is shown by way of example at a start position, and the drone is uniquely identified by the drone identifier "SIM-batch-048". The drone information at the lower right corner of the figure includes the longitude, latitude, altitude and height above ground of the current position of the drone 100.
The three-dimensional environment model may be constructed by the server 102 in advance according to a flight mission to be executed by the drone 100, and there are various methods for constructing the three-dimensional environment model, which are not limited in this specification. In one possible embodiment, the server 102 may determine the flight area of the drone 100 for each flight mission to be performed based on each flight mission to be performed by the drone 100. And then, acquiring each environment image in the flight area aiming at each flight task to be executed, and constructing a three-dimensional environment model of the flight area according to the acquired environment images.
When multiple terminals 104 are used for monitoring the flight condition of the unmanned aerial vehicle, each terminal 104 may send an environment model acquisition request to the server 102, and the server 102 may issue a constructed three-dimensional environment model to a corresponding terminal 104 according to a terminal identifier carried in the environment model acquisition request sent by each terminal 104.
Or in another embodiment, the terminal 104 may also acquire an environment image of each flight area where the drone 100 flies, and construct a three-dimensional environment model of each flight area according to the acquired environment image. And then, sending the constructed three-dimensional environment model to the server 102 for storage, so that when the server 102 receives an environment model acquisition request sent by another terminal 104, the constructed three-dimensional environment model is sent to the corresponding other terminal 104 according to the terminal identifier carried in the environment model acquisition request.
When acquiring environment images of the flight area, the images may be acquired from a plurality of angles by oblique photogrammetry, using a flight device such as an airplane or a drone equipped with a plurality of image sensors. When constructing the three-dimensional environment model of the flight area, mature modeling software such as Smart3D or PIX4D can be used directly. Since constructing a three-dimensional environment model from oblique photogrammetry is common prior art, the specific construction of the three-dimensional environment model of the flight area is not described in detail in this specification; reference may be made to specific embodiments in the prior art.
Further, in order to truly monitor the flight of the drone 100 in real space, when constructing the three-dimensional environment model of the flight area, a true-to-scale three-dimensional environment model may be constructed based on the World Geodetic System 1984 coordinate system (WGS-84), that is, with the Earth's center of mass as the coordinate origin. The shape and size of each environment object in the three-dimensional environment model are the same as those of the real environment object, and the position of each environment object in the model is the same as its position in the real world. Of course, since the terminal 104 mainly shows the relative position relationship between the current position of the drone 100 and each environment object in the flight area, the environment objects in the three-dimensional environment model may also be scaled down or up proportionally; this is not limited in this specification and may be set as needed.
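Placing model points in the WGS-84 geocentric frame mentioned above amounts to the standard geodetic-to-ECEF conversion. The sketch below uses the published WGS-84 ellipsoid constants; the function name is illustrative and not from the patent.

```python
import math

# WGS-84 ellipsoid parameters (defined constants of the datum).
WGS84_A = 6378137.0                  # semi-major axis, metres
WGS84_F = 1 / 298.257223563          # flattening
WGS84_E2 = WGS84_F * (2 - WGS84_F)   # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert latitude/longitude/altitude to Earth-centred (ECEF) x, y, z."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    # Prime-vertical radius of curvature at this latitude.
    n = WGS84_A / math.sqrt(1 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - WGS84_E2) + alt_m) * math.sin(lat)
    return x, y, z
```

The longitude/latitude/altitude reported by the drone 100 (fig. 3) can be passed through this conversion to obtain its position in the geocentric model frame.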
Further, in order to show the flight state of the drone 100 more realistically, a model of the real drone 100 may also be constructed in this specification, and the position change of the drone 100 is shown in the three-dimensional environment model through this model. The shape, size and other parameters of the model remain the same as those of the real drone 100. Of course, since the terminal 104 mainly shows the relative position relationship between the drone 100 and each environment object in the flight area, the size of the drone model may also be scaled down or up in the same proportion as the environment objects.
In another embodiment of this specification, in order to reduce the amount of computation required for model construction, instead of modeling the drone 100 or each environment object in the flight area, a figure or a combination of figures with a shape similar to the drone 100 or the environment object may be used instead; for example, a tall building in the environment may be displayed directly as a cuboid in the three-dimensional environment model.
In this specification, after the server 102 receives the route acquisition request sent by the terminal 104, it may determine, from pre-stored route information, the route information for the drone 100 to execute the current mission, and return that information to the terminal 104. The route acquisition request includes at least one of the drone identifier of the drone 100 and the task identifier of the current mission. When the request includes the drone identifier, the server 102 may determine, from the pre-stored route information corresponding to each drone, the route information for the drone 100 corresponding to that identifier. When the request includes the task identifier of the current mission, the route information of the mission corresponding to that identifier may be determined from the pre-stored route information corresponding to each mission.
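The two lookup branches just described can be sketched as follows; the request field names and the precedence of the drone identifier over the task identifier are illustrative assumptions.

```python
# Sketch of the server-side route lookup: a route-acquisition request
# carries a drone identifier and/or a task identifier, and either one
# resolves to pre-stored route information.

def lookup_route(request, routes_by_drone, routes_by_task):
    """Resolve route info from a route-acquisition request dict."""
    if "drone_id" in request:
        return routes_by_drone[request["drone_id"]]
    if "task_id" in request:
        return routes_by_task[request["task_id"]]
    raise ValueError("request must carry a drone or task identifier")
```

A production server would back the two dictionaries with its route store and return a not-found response instead of raising.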
Moreover, the route information includes at least the planned flight path and the planned flight time for the drone 100 to execute the current mission, where the planned flight time includes at least the total flight time of the mission and may also include the time at which the drone 100 is planned to reach each position in the planned flight path, or the planned flight progress of the drone 100 at each flight time node.
Then, the terminal 104 may display the entire planned flight path of the drone 100 in the pre-constructed three-dimensional environment model according to the received route information. As shown in fig. 3, the gray line segment represents the route of the current flight of the drone 100, that is, the planned flight path, and the height information of the planned flight path is shown intuitively.
Finally, since the position of the drone 100 changes in real time, the drone 100 sends its current position information to the server 102 in real time, and the server 102 forwards it to the terminal 104, so that the terminal 104 can adjust the position of the drone model in the three-dimensional environment model according to the latest received current position information.
Further, in order to track the flight trajectory of the drone 100 in real time, the terminal 104 may also display, in the planned flight path, the progress the drone 100 is expected to have flown so far, according to the planned flight time to each position contained in the route information. For example, the portion of the planned flight path that the drone 100 should already have covered may be displayed in a prominent manner (e.g., bolded or in a different color) within the original planned flight path. As shown in fig. 4, which exemplarily shows a planned flight path of the drone 100 from a starting point to an ending point, with a single building exemplarily representing the pre-constructed three-dimensional environment model, the terminal 104 may, during the flight of the drone 100, display the expected flight progress in bold within the planned flight path according to the received route information. The bold route in fig. 4 represents the progress the unmanned aerial vehicle is expected to have flown so far, and the position A at the end of the bold route represents the position the unmanned aerial vehicle is expected to have reached. The position B of the unmanned aerial vehicle icon in the figure is the current position of the unmanned aerial vehicle during the actual flight. The terminal 104 also adjusts the position of the model of the drone 100 in the three-dimensional environment model in real time according to the current position information of the drone 100 forwarded by the server 102. The deviation between the current flight progress and the expected flight progress of the drone 100 can thus be observed intuitively in fig. 4.
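The expected-progress display described above reduces to interpolating the drone's expected position along the planned flight path from the planned arrival times. A minimal sketch, assuming the route information provides waypoints together with a planned arrival time (in seconds) at each waypoint; the function name and data layout are illustrative assumptions:

```python
import bisect

def expected_position(waypoints, times, elapsed):
    """Linearly interpolate where the drone should be `elapsed` seconds
    after takeoff, given planned arrival times at each waypoint."""
    if elapsed <= times[0]:
        return waypoints[0]
    if elapsed >= times[-1]:
        return waypoints[-1]
    i = bisect.bisect_right(times, elapsed)      # index of segment end
    t0, t1 = times[i - 1], times[i]
    f = (elapsed - t0) / (t1 - t0)               # fraction along segment
    p0, p1 = waypoints[i - 1], waypoints[i]
    return tuple(a + f * (b - a) for a, b in zip(p0, p1))
```

The portion of the path before `expected_position(...)` would then be rendered bold (or recolored) to mark the expected progress.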
In another embodiment of the present specification, the drone 100 may further send its current state information to the server 102 at time intervals, where the current state information includes flight parameters of the drone 100, such as flying speed, flying state (whether flying or stationary), flying heading, and flying attitude (pitch angle, yaw angle, and roll angle). The server 102 forwards the current state information of the drone 100 to the terminal 104, so that the terminal 104 displays the current flight parameters of the drone 100 according to this information. As shown in fig. 3, information such as the current flying speed and flying heading of the drone 100 is also displayed in the drone information panel at the lower right corner of fig. 3.
Of course, the terminal 104 may also display the attitude of the model of the drone corresponding to the drone 100, such as a pitch angle, a yaw angle, a roll angle, and the like, in the three-dimensional environment model according to the flight parameters in the current state information of the drone 100. Moreover, since the state information of the drone 100 changes in real time during the flight process, the terminal 104 may also adjust the attitude of the drone model in the three-dimensional environment model in real time according to the latest received flight parameters.
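Rendering the drone model's attitude from the reported pitch, yaw, and roll angles amounts to building a rotation matrix for the model in the three-dimensional scene. A sketch under the common Z-Y-X (yaw-pitch-roll) aerospace convention, which is an assumption here since the patent does not specify a convention:

```python
import math

def attitude_matrix(pitch, yaw, roll):
    """Z-Y-X (yaw, then pitch, then roll) rotation matrix used to orient
    the drone model in the scene; all angles in radians."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
```

On each state update, the terminal would recompute this matrix from the latest flight parameters and apply it to the drone model's transform.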
In another embodiment, the terminal 104 may also determine whether the drone 100 starts flying according to the flight status in the received current status information of the drone 100, and when the drone 100 is in the flight status, send an airline acquisition request to the server 102 to acquire airline information for the drone 100 to execute the current task and perform path display. Alternatively, the server 102 may also store a response program for sending the location information to the terminal 104 in advance, and when a program trigger condition is satisfied, the server 102 may autonomously push the current location information of the drone 100 to the terminal 104. The triggering condition includes, but is not limited to, that the drone starts flying, i.e., the flying state in the current state information changes from stationary to flying.
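The server-side trigger described above, autonomously pushing the position once the flying state changes from stationary to flying, can be sketched as a small state tracker; the class and method names here are illustrative assumptions, not the patent's own implementation:

```python
class PositionPusher:
    """Push the drone's position to the terminal when its flight state
    transitions from stationary to flying."""
    def __init__(self, push):
        self._push = push                # callback that sends data to the terminal
        self._last_state = "stationary"

    def on_state_update(self, state: str, position) -> bool:
        triggered = self._last_state == "stationary" and state == "flying"
        if triggered:
            self._push(position)         # autonomously push current position
        self._last_state = state
        return triggered
```
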
In addition, in the present specification, since the WGS-84 coordinate system is used when the three-dimensional environment model of the flight area is constructed in advance, and displaying each environmental object directly in that global coordinate system offers only low precision, the terminal 104 may perform a coordinate conversion to transform the three-dimensional environment model into a higher-precision local coordinate system for more intuitive display.
Specifically, the terminal 104 may determine a reference point preset in the flight area, that is, a reference point corresponding to the flight area, according to the flight area where the unmanned aerial vehicle 100 executes the current task, and update the position information of each position in the three-dimensional environment model by taking the reference point as the origin of the local coordinate system again, so as to convert the three-dimensional environment model into the local coordinate system for display.
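The conversion from WGS-84 coordinates to a local coordinate system with the reference point as origin can be sketched as follows. The flat-earth east/north/up approximation and the function name are assumptions for illustration (adequate only for small flight areas); the patent does not prescribe a particular conversion:

```python
import math

EARTH_R = 6378137.0  # WGS-84 equatorial radius, metres

def wgs84_to_local(lat, lon, alt, ref_lat, ref_lon, ref_alt):
    """Convert a WGS-84 position (degrees, metres) to metres east/north/up
    of the flight area's reference point, used as the local origin."""
    d_lat = math.radians(lat - ref_lat)
    d_lon = math.radians(lon - ref_lon)
    north = d_lat * EARTH_R
    east = d_lon * EARTH_R * math.cos(math.radians(ref_lat))
    return east, north, alt - ref_alt
```

Applying this to every vertex of the model re-expresses the scene about the reference point, which is what "updating the position information of each position" amounts to.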
Based on the unmanned aerial vehicle monitoring system shown in fig. 2, which includes the unmanned aerial vehicle, the server and the terminal, the unmanned aerial vehicle sends its current position information to the server at time intervals, and the server forwards that information to the terminal. The terminal acquires the route information of the unmanned aerial vehicle from the server, displays the planned flight path of the unmanned aerial vehicle in the pre-constructed three-dimensional environment model, and displays the current position of the unmanned aerial vehicle in the three-dimensional environment model according to the current position information of the unmanned aerial vehicle. Displaying the planned flight path and the real-time position of the unmanned aerial vehicle in the pre-constructed three-dimensional environment model makes it easier to observe yaw deviations of the unmanned aerial vehicle in the height direction, so that the observation is more comprehensive.
In order to more intuitively show the interaction between the devices in the unmanned aerial vehicle monitoring system, this specification also provides a three-party interaction diagram of the system, as shown in fig. 5. The drone 100 may send its current position information to the server 102 at time intervals, and the server 102 forwards it to the terminal 104. The terminal 104 may send a route acquisition request to the server 102 to acquire route information, display the planned flight path of the unmanned aerial vehicle 100 in the pre-constructed three-dimensional environment model according to that route information, and display the current position of the unmanned aerial vehicle 100 in the three-dimensional environment model in real time according to the received current position information.
In this specification, the flight status of multiple drones 100 may also be monitored at the same time, and then when the terminal 104 acquires the position information of the drone 100, the terminal may send a drone 100 position acquisition request carrying the drone identifier of the drone 100 to the server 102, so that the server 102 determines the current position information of the drone 100 corresponding to the drone identifier according to the drone identifier included in the drone 100 position acquisition request, and returns the current position information to the terminal 104.
Moreover, when the terminal 104 acquires the route information of the current task executed by the drone 100, the drone identifier of the drone 100 is also added to the route acquisition request, so that the server 102 determines the route information of the drone 100 corresponding to the drone identifier according to the drone identifier carried in the route acquisition request, and returns the route information to the terminal 104.
Note that the flight mission performed by the drone 100 may be a distribution mission, an observation mission, and the like. When the drone 100 executes the distribution task, the planned flight path of the drone 100 is the flight path of the drone 100 from the distribution start point to the distribution end point. The flight state of the drone 100 during the execution of the delivery task can be monitored by the drone monitoring system in this specification.
In one or more embodiments of the present specification, the route information of the drone 100 and the current position information of the drone 100 may also be obtained from different servers 102. For description, the server 102 corresponding to the current position information of the drone 100 is taken as a first server 102, and the server 102 corresponding to the route information of the drone 100 is taken as a second server 102. The drone 100 may send its current position information to the first server 102 at time intervals, and the first server 102 forwards it to the terminal 104. When the terminal 104 receives the current position information of the drone 100 forwarded by the first server 102 for the first time, it may send a route acquisition request to the second server 102 to acquire the route information for the drone 100 to execute the current task.
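The two-server flow can be sketched as follows: the terminal requests the route from the second server upon the first position message forwarded by the first server. Class and method names are illustrative assumptions, not the patent's API:

```python
class Terminal:
    """Terminal that lazily fetches the route from the route (second)
    server the first time a position update arrives from the first server."""
    def __init__(self, route_server):
        self._route_server = route_server
        self._route = None

    def on_position(self, drone_id, position):
        if self._route is None:                       # first position received
            self._route = self._route_server.get_route(drone_id)
        return position, self._route
```
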
For the unmanned aerial vehicle monitoring system shown in fig. 1, the present specification also correspondingly provides an unmanned aerial vehicle monitoring method adopted in the unmanned aerial vehicle monitoring system, as shown in fig. 6.
Fig. 6 is a schematic flow chart of a method for monitoring an unmanned aerial vehicle provided in an embodiment of the present specification, where the method for monitoring an unmanned aerial vehicle may be used in an unmanned aerial vehicle monitoring system, and specifically may include the following steps:
s200: and the terminal sends a route acquisition request to the server, and displays the planned flight path of the unmanned aerial vehicle in a pre-constructed three-dimensional environment model according to the received route information.
In this specification, the unmanned aerial vehicle monitoring system includes an unmanned aerial vehicle, a server and a terminal. The server is used for controlling the flight of the unmanned aerial vehicle and receiving the position information of the unmanned aerial vehicle in real time. The terminal is used for displaying the planned flight path and the real-time flight state of the unmanned aerial vehicle. The unmanned aerial vehicle monitoring method provided in this specification may be executed by any device in the unmanned aerial vehicle monitoring system; for convenience of description, this specification takes the terminal as the executing entity as an example.
Specifically, when monitoring the unmanned aerial vehicle, the terminal can send a route acquisition request to the server to acquire the route information of the unmanned aerial vehicle, and display the planned flight path of the unmanned aerial vehicle in a pre-constructed three-dimensional environment model according to the acquired route information. The terminal can also display, in the planned flight path, the progress the unmanned aerial vehicle is expected to have flown so far, according to the planned flight time (the flight time to reach each position) contained in the route information.
The method for constructing the three-dimensional environment model is explained on the system side, and is not described herein again.
S202: and when the current position information of the unmanned aerial vehicle is received, displaying the current position of the unmanned aerial vehicle in the three-dimensional environment model according to the current position information of the unmanned aerial vehicle.
When the flight of the unmanned aerial vehicle is monitored in this specification, the terminal can display the real-time position changes of the unmanned aerial vehicle in the three-dimensional environment model, so as to visually observe whether the unmanned aerial vehicle deviates from its planned course.
Specifically, when the terminal receives the current position information of the unmanned aerial vehicle forwarded by the server, it can display the current position of the unmanned aerial vehicle in the three-dimensional environment model according to that information. Moreover, since the position of the unmanned aerial vehicle changes in real time during flight, the terminal can adjust the position of the unmanned aerial vehicle in the three-dimensional environment model in real time according to the latest received position.
Based on the unmanned aerial vehicle monitoring method shown in fig. 6, the terminal can acquire the route information of the unmanned aerial vehicle from the server, display the planned flight path of the unmanned aerial vehicle in the pre-constructed three-dimensional environment model according to the acquired route information, and display the current position of the unmanned aerial vehicle in the three-dimensional environment model when receiving the current position information of the unmanned aerial vehicle. Displaying the planned flight path and the real-time position changes of the unmanned aerial vehicle in the three-dimensional environment model makes it easier to observe yaw deviations of the unmanned aerial vehicle in the height direction, so that the observation is more comprehensive.
In addition, the unmanned aerial vehicle monitoring method provided in this specification can also be applied to monitoring the flight state of the unmanned aerial vehicle while it executes a distribution task. For the remaining details of monitoring the unmanned aerial vehicle, reference may be made to the detailed description of the unmanned aerial vehicle monitoring system provided in this specification; since these details have been described above, they are not repeated here.
Based on the unmanned aerial vehicle monitoring method shown in fig. 6, the embodiment of the present specification further provides a schematic structural diagram of an unmanned aerial vehicle monitoring apparatus, as shown in fig. 7.
Fig. 7 is a schematic structural diagram of an unmanned aerial vehicle monitoring device provided in an embodiment of this specification, where the unmanned aerial vehicle monitoring device bears a pre-constructed three-dimensional environment model, and the three-dimensional environment model is constructed based on environmental information of a flight area of the unmanned aerial vehicle, and the unmanned aerial vehicle monitoring device includes:
the route request module 300 is used for sending a route acquisition request to a server and displaying a planned flight path of the unmanned aerial vehicle in the three-dimensional environment model according to received route information;
and a position display module 302, configured to display the current position of the unmanned aerial vehicle in the three-dimensional environment model according to the current position information of the unmanned aerial vehicle when receiving the current position information of the unmanned aerial vehicle.
The embodiment of the present specification further provides a computer-readable storage medium, where the storage medium stores a computer program, and the computer program may be used to execute the drone monitoring method as provided in fig. 6.
Of course, besides the software implementation, the present specification does not exclude other implementations, such as logic devices or a combination of software and hardware, and the like, that is, the execution subject of the following processing flow is not limited to each logic unit, and may be hardware or logic devices.
In the 1990s, an improvement in a technology could be clearly distinguished as an improvement in hardware (e.g., an improvement in a circuit structure such as a diode, a transistor, or a switch) or an improvement in software (an improvement in a process flow). However, as technology advances, many of today's process-flow improvements can be regarded as direct improvements in hardware circuit structure. Designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Thus, it cannot be said that an improvement in a process flow cannot be realized by hardware physical modules. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a PLD by programming it, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, nowadays, instead of manually fabricating an integrated circuit chip, such programming is mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development, and the source code to be compiled must be written in a specific programming language called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), of which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing the logical method flow can easily be obtained merely by briefly programming the method flow into an integrated circuit using the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of such microcontrollers include, but are not limited to, the ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art will also appreciate that, in addition to implementing the controller as pure computer-readable program code, the same functionality can be implemented by logically programming the method steps so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may thus be regarded as a hardware component, and the means included therein for performing various functions may also be regarded as structures within the hardware component. Indeed, means for performing the functions may be regarded both as software modules for performing the method and as structures within the hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functions of the various elements may be implemented in the same one or more software and/or hardware implementations of the present description.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
This description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present specification, and is not intended to limit the present specification. Various modifications and alterations to this description will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present specification should be included in the scope of the claims of the present specification.

Claims (13)

1. An unmanned aerial vehicle monitoring system, characterized in that the system comprises an unmanned aerial vehicle, a terminal and a server, wherein:
the unmanned aerial vehicle is configured to send current position information of the unmanned aerial vehicle to the server according to a time interval;
the terminal bears a pre-constructed three-dimensional environment model, and the three-dimensional environment model is constructed based on environment information in the unmanned aerial vehicle flight area; the system is configured to send a route obtaining request to the server, and a planned flight path of the unmanned aerial vehicle is displayed in the three-dimensional environment model according to received route information; when receiving the current position information of the unmanned aerial vehicle, displaying the current position of the unmanned aerial vehicle in the three-dimensional environment model according to the current position information of the unmanned aerial vehicle;
the server is configured to forward the received current position information of the unmanned aerial vehicle to the terminal; and returning the air route information of the unmanned aerial vehicle to the terminal according to the received air route acquisition request.
2. The system of claim 1, wherein the terminal is further configured to display a schedule in the planned flight path that the drone is expected to have currently flown according to a planned flight time contained in the received airline information.
3. The system of claim 1, wherein the pre-constructed three-dimensional environmental model has a geocentric origin;
the terminal is further configured to determine a reference point corresponding to a flight area according to the flight area of the unmanned aerial vehicle executing the current task;
and updating the position information of each position in the three-dimensional environment model by taking the reference point as the origin again.
4. The system of claim 1, wherein the server is further configured to, when there are a plurality of drones, receive a drone position acquisition request carrying a drone identifier sent by the terminal, and forward current position information of a drone corresponding to the drone identifier to the terminal according to the drone identifier;
according to the unmanned aerial vehicle identification contained in the received air route acquisition request, determining the air route information of the unmanned aerial vehicle corresponding to the unmanned aerial vehicle identification, and returning the air route information of the unmanned aerial vehicle to the terminal.
5. The system of claim 1, wherein the drone is further configured to send current state information of itself to the server at time intervals, the current state information containing flight parameters of the drone;
the server is further configured to forward the received current state information of the unmanned aerial vehicle to the terminal;
the terminal is further configured to display the current flight parameters of the unmanned aerial vehicle according to the received current state information of the unmanned aerial vehicle.
6. The system of claim 5, wherein the terminal is further configured to display the pose of the drone in the three-dimensional environmental model according to the flight parameters received in the current state information of the drone;
wherein the flight parameters at least include a flight pose of the drone.
7. The system of claim 1, wherein the server is further configured to determine a flight area for the drone to perform each flight mission based on the flight mission to be performed by the drone;
and aiming at each flight task to be executed of the unmanned aerial vehicle, acquiring each environment image in a flight area corresponding to the flight task, and constructing a three-dimensional environment model of the flight area according to the acquired environment images.
8. The system of claim 7, wherein the server is further configured to receive an environment model acquisition request sent by a terminal, and send the constructed three-dimensional environment model to the corresponding terminal according to a terminal identifier carried in the environment model acquisition request.
9. The system of claim 1, wherein the terminal is further configured to determine a flight area for the drone to perform each flight mission based on the flight mission to be performed by the drone;
and aiming at each flight task to be executed of the unmanned aerial vehicle, acquiring each environment image in a flight area corresponding to the flight task, and constructing a three-dimensional environment model of the flight area according to the acquired environment images.
10. The system of claim 9, wherein the terminal is further configured to send the constructed three-dimensional environment model to the server;
and the server is further configured to store the received three-dimensional environment model and, upon receiving an environment model acquisition request sent by another terminal, send the stored three-dimensional environment model to that terminal.
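Claims 9 and 10 describe a store-and-forward scheme: a terminal builds the model, uploads it, and the server redistributes it to other terminals on request. A minimal server-side sketch, where the flight-area key and raw-bytes payload are illustrative assumptions (the patent only specifies storing and re-sending):

```python
class ModelStore:
    """Server-side store for terminal-built 3D environment models (claim 10).

    A terminal uploads the model it constructed; other terminals later fetch
    it by a flight-area key. The key scheme and serialized payload are
    assumptions for illustration.
    """
    def __init__(self):
        self._models = {}

    def upload(self, area_id, model_bytes):
        self._models[area_id] = model_bytes

    def fetch(self, area_id):
        # Returns None when no model has been uploaded for this area yet.
        return self._models.get(area_id)

store = ModelStore()
store.upload("area-1", b"<serialized 3D environment model>")
```

A real server would key the lookup by whatever the environment model acquisition request carries (claim 8 suggests a terminal identifier is also involved in routing the response).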
11. An unmanned aerial vehicle monitoring method is characterized by comprising the following steps:
sending, by a terminal, a route acquisition request to a server, and displaying a planned flight path of the unmanned aerial vehicle in a pre-constructed three-dimensional environment model according to received route information, wherein the terminal hosts the pre-constructed three-dimensional environment model, the three-dimensional environment model being constructed based on environment information of a flight area of the unmanned aerial vehicle;
and, when current position information of the unmanned aerial vehicle is received, displaying the current position of the unmanned aerial vehicle in the three-dimensional environment model according to the current position information.
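The two steps of claim 11 can be sketched as a terminal-side flow: one route acquisition request that yields the planned path, then incremental position updates. Networking and 3D rendering are stubbed out, and all names (`MonitorTerminal`, `get_route`, `StubServer`) are illustrative assumptions:

```python
class MonitorTerminal:
    """Terminal-side flow of claim 11: request the route once, then keep
    updating the displayed position as reports arrive. Rendering is stubbed."""
    def __init__(self, server):
        self.server = server
        self.planned_path = None
        self.current_position = None

    def request_route(self, drone_id):
        # The "route acquisition request"; route info comes back as waypoints.
        self.planned_path = self.server.get_route(drone_id)
        # A renderer would now draw self.planned_path inside the 3D model.

    def on_position(self, position):
        # Called whenever current position information is received.
        self.current_position = position  # and the drone marker is redrawn

class StubServer:
    """Stands in for the server; the returned route is made-up example data."""
    def get_route(self, drone_id):
        return [(0, 0, 30), (100, 0, 30), (100, 100, 30)]

t = MonitorTerminal(StubServer())
t.request_route("uav-1")
t.on_position((40, 0, 30))
```

Drawing both the planned path and the live position in the same model is what lets an operator see deviation from the route at a glance.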
12. An unmanned aerial vehicle monitoring device, characterized in that the unmanned aerial vehicle monitoring device hosts a pre-constructed three-dimensional environment model, the three-dimensional environment model being constructed based on environment information of a flight area of the unmanned aerial vehicle, the device comprising:
a route request module, configured to send a route acquisition request to a server and display a planned flight path of the unmanned aerial vehicle in the three-dimensional environment model according to received route information;
and a position display module, configured to display, when current position information of the unmanned aerial vehicle is received, the current position of the unmanned aerial vehicle in the three-dimensional environment model according to the current position information.
13. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of claim 11.
CN202110583594.3A 2021-05-27 2021-05-27 Unmanned aerial vehicle monitoring system, method, device and storage medium Pending CN113238571A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110583594.3A CN113238571A (en) 2021-05-27 2021-05-27 Unmanned aerial vehicle monitoring system, method, device and storage medium
PCT/CN2022/086171 WO2022247498A1 (en) 2021-05-27 2022-04-11 Unmanned aerial vehicle monitoring

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110583594.3A CN113238571A (en) 2021-05-27 2021-05-27 Unmanned aerial vehicle monitoring system, method, device and storage medium

Publications (1)

Publication Number Publication Date
CN113238571A true CN113238571A (en) 2021-08-10

Family

ID=77139049

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110583594.3A Pending CN113238571A (en) 2021-05-27 2021-05-27 Unmanned aerial vehicle monitoring system, method, device and storage medium

Country Status (2)

Country Link
CN (1) CN113238571A (en)
WO (1) WO2022247498A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113536467A (en) * 2021-07-24 2021-10-22 深圳市北斗云信息技术有限公司 Unmanned aerial vehicle remote operation display system
CN113670275A (en) * 2021-08-13 2021-11-19 诚邦测绘信息科技(浙江)有限公司 Unmanned aerial vehicle surveying and mapping method, system and storage medium for ancient buildings
CN113791631A (en) * 2021-09-09 2021-12-14 常州希米智能科技有限公司 Unmanned aerial vehicle positioning flight control method and device based on Beidou
WO2022247498A1 (en) * 2021-05-27 2022-12-01 北京三快在线科技有限公司 Unmanned aerial vehicle monitoring

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115980742B (en) * 2023-03-20 2023-05-19 成都航空职业技术学院 Radar detection method and device for unmanned aerial vehicle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105739535A (en) * 2016-04-29 2016-07-06 竒葩网络(深圳)有限公司 Unmanned aerial vehicle flight control method, unmanned aerial vehicle flight control device and unmanned aerial vehicle flight control system
US20160292869A1 (en) * 2015-03-03 2016-10-06 PreNav, Inc. Scanning environments and tracking unmanned aerial vehicles
WO2018152849A1 (en) * 2017-02-27 2018-08-30 深圳市大疆创新科技有限公司 Control method, remote monitoring device, base station, server and streaming media server
CN109917813A (en) * 2019-04-19 2019-06-21 成都蔚来空间科技有限公司 Unmanned plane autonomous flight three-dimensional scenic display methods and terminal
CN110262545A (en) * 2019-05-30 2019-09-20 中国南方电网有限责任公司超高压输电公司天生桥局 A kind of unmanned plane during flying Three-Dimensional Path Planning Method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016210432A1 (en) * 2015-06-26 2016-12-29 Apollo Robotic Systems Incorporated Robotic apparatus, systems, and related methods
EP3195292B1 (en) * 2015-07-10 2019-01-09 SZ DJI Technology Co., Ltd. Systems and methods for gimbal simulation
EP3564621A4 (en) * 2016-12-28 2020-08-19 SZ DJI Technology Co., Ltd. Flight path display method, mobile platform, flight system, recording medium, and program
CN106991681B (en) * 2017-04-11 2020-01-14 福州大学 Method and system for extracting and visualizing fire boundary vector information in real time
CN109917814A (en) * 2019-04-19 2019-06-21 成都蔚来空间科技有限公司 Unmanned plane operational method and system
WO2021064982A1 (en) * 2019-10-04 2021-04-08 株式会社トラジェクトリー Information processing device and information processing method
CN112287056A (en) * 2020-11-04 2021-01-29 北京蒙泰华奥国际贸易有限公司 Navigation management visualization method and device, electronic equipment and storage medium
CN112652065A (en) * 2020-12-18 2021-04-13 湖南赛吉智慧城市建设管理有限公司 Three-dimensional community modeling method and device, computer equipment and storage medium
CN113238571A (en) * 2021-05-27 2021-08-10 北京三快在线科技有限公司 Unmanned aerial vehicle monitoring system, method, device and storage medium


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022247498A1 (en) * 2021-05-27 2022-12-01 北京三快在线科技有限公司 Unmanned aerial vehicle monitoring
CN113536467A (en) * 2021-07-24 2021-10-22 深圳市北斗云信息技术有限公司 Unmanned aerial vehicle remote operation display system
CN113670275A (en) * 2021-08-13 2021-11-19 诚邦测绘信息科技(浙江)有限公司 Unmanned aerial vehicle surveying and mapping method, system and storage medium for ancient buildings
CN113670275B (en) * 2021-08-13 2024-01-02 诚邦测绘信息科技(浙江)有限公司 Unmanned aerial vehicle mapping method, system and storage medium for ancient building
CN113791631A (en) * 2021-09-09 2021-12-14 常州希米智能科技有限公司 Unmanned aerial vehicle positioning flight control method and device based on Beidou

Also Published As

Publication number Publication date
WO2022247498A1 (en) 2022-12-01

Similar Documents

Publication Publication Date Title
CN113238571A (en) Unmanned aerial vehicle monitoring system, method, device and storage medium
US20210141518A1 (en) Graphical user interface customization in a movable object environment
US10134298B2 (en) System and method for supporting simulated movement
CN110765620B (en) Aircraft visual simulation method, system, server and storage medium
CN107735737B (en) Waypoint editing method, device, equipment and aircraft
CN108628334B (en) Control method, device and system of unmanned aerial vehicle and unmanned aerial vehicle
CN108496130A (en) Flight control method, equipment, control terminal and its control method, unmanned plane
CN106406354A (en) Distributed aircraft formation method based on three-dimensional dynamic obstacle avoidance
CN104932527A (en) Aircraft control method and device
CN110570692B (en) Unmanned aerial vehicle air route detection method and device
JP6829513B1 (en) Position calculation method and information processing system
US10565783B2 (en) Federated system mission management
CN108268481A (en) High in the clouds map updating method and electronic equipment
CN112712558A (en) Positioning method and device of unmanned equipment
CN110362102B (en) Method, device and system for generating unmanned aerial vehicle route
Tisdale et al. The software architecture of the Berkeley UAV platform
KR20190068955A (en) Device for flight simulating of unmanned aerial vehicle, and system for flight simulating of unmanned aerial vehicle using thereof
US20210357620A1 (en) System, moving object, and information processing apparatus
CN114047760A (en) Path planning method and device, electronic equipment and automatic driving vehicle
CN115979262B (en) Positioning method, device and equipment of aircraft and storage medium
CN113515137A (en) Unmanned aerial vehicle control method and device, storage medium and electronic equipment
JP6661187B1 (en) Aircraft management server and management system
CN108270816A (en) High in the clouds map rejuvenation equipment
US10802685B2 (en) Vehicle marking in vehicle management systems
JPWO2021079516A1 (en) Flight route creation method and management server for aircraft

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210810