CN117873148A - Unmanned aerial vehicle control method, unmanned aerial vehicle control system, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN117873148A
CN117873148A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
ground station
data
operation instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311699037.3A
Other languages
Chinese (zh)
Inventor
袁杭良
谢凡
李志飞
Current Assignee
Goertek Robotics Co Ltd
Original Assignee
Goertek Robotics Co Ltd
Priority date
Filing date
Publication date
Application filed by Goertek Robotics Co Ltd filed Critical Goertek Robotics Co Ltd
Priority to CN202311699037.3A priority Critical patent/CN117873148A/en
Publication of CN117873148A publication Critical patent/CN117873148A/en
Pending legal-status Critical Current

Abstract

The application discloses an unmanned aerial vehicle control method, an unmanned aerial vehicle control system, an electronic device, and a computer-readable storage medium, and relates to the technical field of unmanned aerial vehicles. The unmanned aerial vehicle control method is applied to a first ground station and comprises the following steps: receiving flight data and shooting data of the unmanned aerial vehicle, and sending the flight data and the shooting data to a second ground station; receiving a first operation instruction issued by a first user for the flight data and the shooting data, and a second operation instruction transmitted by the second ground station; and sending the first operation instruction and the second operation instruction to the unmanned aerial vehicle so as to control the current flight attitude and the current shooting attitude of the unmanned aerial vehicle. The method and system solve the technical problem of high operational complexity of existing unmanned aerial vehicle control methods under complex working conditions.

Description

Unmanned aerial vehicle control method, unmanned aerial vehicle control system, electronic equipment and computer readable storage medium
Technical Field
The application relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle control method, an unmanned aerial vehicle control system, electronic equipment and a computer readable storage medium.
Background
With the development of unmanned aerial vehicle technology, drones are now applied in increasingly wide fields, such as plant protection, logistics, and long-distance inspection.
Current drone control schemes typically have a single pilot (i.e., a drone operator) fly the drone by remote control while observing and manipulating the camera on the drone through FPV (First Person View). However, when the unmanned aerial vehicle performs tasks under complex conditions, such as high-voltage power grid inspection or search operations, a single pilot has difficulty attending to both flight and camera control, which affects completion of the work task; that is, existing unmanned aerial vehicle control methods have high operational complexity under complex working conditions.
Disclosure of Invention
The main aim of the application is to provide an unmanned aerial vehicle control method that solves the technical problem of high operational complexity of existing unmanned aerial vehicle control methods under complex working conditions.
To achieve the above object, in a first aspect, the present application provides a method for controlling a drone, which is applied to a first ground station, the method for controlling a drone including the steps of:
receiving flight data and shooting data of the unmanned aerial vehicle, and sending the flight data and the shooting data to a second ground station;
receiving a first operation instruction issued by a first user for the flight data and the shooting data, and a second operation instruction transmitted by the second ground station;
and sending the first operation instruction and the second operation instruction to the unmanned aerial vehicle so as to control the current flight attitude and the current shooting attitude of the unmanned aerial vehicle.
According to a first aspect, the step of sending the first operation instruction and the second operation instruction to the unmanned aerial vehicle to control the current flight attitude and the current shooting attitude of the unmanned aerial vehicle includes:
the first operation instruction is sent to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls the current flight attitude of the unmanned aerial vehicle according to the first operation instruction;
and sending the second operation instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls the current shooting attitude of the unmanned aerial vehicle according to the second operation instruction.
According to a first aspect, or any implementation manner of the first aspect, after the step of sending the flight data and the shooting data to a second ground station, the method further includes:
receiving an air route adjustment instruction sent by the second ground station;
And sending the route adjustment instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle can adjust the route parameters of the unmanned aerial vehicle according to the route adjustment instruction.
According to a first aspect, or any implementation manner of the first aspect, the step of receiving flight data and shooting data of the unmanned aerial vehicle includes:
receiving, using the GStreamer library, the shot video sent by the unmanned aerial vehicle to a specified UDP multicast address, and decoding and displaying the shot video;
and reading the communication module of the first ground station through GPIO, and parsing the Mavlink data sent by the unmanned aerial vehicle to obtain the flight data of the unmanned aerial vehicle.
According to a first aspect, or any implementation manner of the first aspect, the step of decoding and displaying the shot video includes:
removing the header information of the RTP data packet corresponding to the shot video through an rtph264depay element;
parsing the H.264 video stream corresponding to the shot video through an h264parse element;
selecting and creating an adaptive decoder for the shot video through a decodebin element;
performing video format conversion on the shot video through a videoconvert element, converting the decoded video into RGB format;
and displaying the shot video in RGB format on a screen through an autovideosink element.
According to a first aspect, or any implementation manner of the first aspect, before the step of sending the flight data and the shooting data to a second ground station, the method includes:
modifying the source code of the operating system of the first ground station to obtain the superuser identity;
based on the superuser identity, connecting the network port eth1 of the first ground station to the communication module of the first ground station through a preset bridging tool, connecting the network port eth0 of the first ground station to the second ground station, and bridging the eth1 and eth0 network interfaces into br0, so as to establish a br0 network bridge between the first ground station and the second ground station.
In a second aspect, the present application provides a method for controlling a drone, applied to a second ground station, the method for controlling a drone including the steps of:
receiving flight data and shooting data of the unmanned aerial vehicle forwarded by a first ground station;
receiving a second operation instruction issued by a second user aiming at the flight data and the shooting data;
forwarding the second operation instruction to the unmanned aerial vehicle through the first ground station, so that the unmanned aerial vehicle controls its current flight attitude and current shooting attitude according to the first operation instruction of the first ground station and the second operation instruction.
According to a second aspect, after the step of receiving the flight data and the shooting data of the unmanned aerial vehicle forwarded by the first ground station, the method further comprises:
receiving an air route adjustment instruction issued by a second user aiming at the flight data and the shooting data;
forwarding the route adjustment instruction to the unmanned aerial vehicle through a first ground station, so that the unmanned aerial vehicle adjusts the route parameters of the unmanned aerial vehicle according to the route adjustment instruction.
In a third aspect, the present application provides a control method of an unmanned aerial vehicle, applied to the unmanned aerial vehicle, the control method of the unmanned aerial vehicle includes the following steps:
acquiring flight data of the unmanned aerial vehicle, and shooting the current surrounding environment of the unmanned aerial vehicle to obtain shooting data;
transmitting the flight data and the shooting data to a first ground station, so that the first ground station forwards the flight data and the shooting data to a second ground station;
receiving a first operation instruction of a first ground station and a second operation instruction transmitted by the second ground station forwarded by the first ground station;
and controlling the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to the first operation instruction and the second operation instruction.
According to a third aspect, the step of controlling the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to the first operation instruction and the second operation instruction includes:
according to the first operation instruction, controlling the current flight attitude of the unmanned aerial vehicle;
and controlling the current shooting attitude of the unmanned aerial vehicle according to the second operation instruction.
According to a third aspect, or any implementation manner of the above third aspect, after the step of sending the flight data and the shooting data to a first ground station, so that the first ground station forwards the flight data and the shooting data to a second ground station, the method further includes:
receiving an air route adjustment instruction sent by the second ground station forwarded by the first ground station;
and adjusting the route parameters of the unmanned aerial vehicle according to the route adjustment instruction.
In a fourth aspect, the present application provides a drone control system, the drone control system comprising: an unmanned aerial vehicle, a first ground station, and a second ground station;
the unmanned aerial vehicle is used for collecting flight data of the unmanned aerial vehicle, shooting the current surrounding environment of the unmanned aerial vehicle to obtain shooting data, and sending the flight data and the shooting data to a first ground station;
The first ground station is used for receiving flight data and shooting data of the unmanned aerial vehicle and sending the flight data and the shooting data to the second ground station;
the second ground station is used for receiving flight data and shooting data of the unmanned aerial vehicle forwarded by the first ground station, receiving a second operation instruction issued by a second user aiming at the flight data and the shooting data, and sending the second operation instruction to the first ground station;
the first ground station is used for receiving a first operation instruction sent by a first user aiming at the flight data and the shooting data and a second operation instruction sent by the second ground station, and sending the first operation instruction and the second operation instruction to the unmanned aerial vehicle;
the unmanned aerial vehicle is used for controlling the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to the first operation instruction and the second operation instruction.
According to a fourth aspect, the unmanned aerial vehicle control system further comprises:
the second ground station is used for receiving route adjustment instructions issued by a second user aiming at the flight data and the shooting data and sending the route adjustment instructions to the first ground station;
The first ground station is used for receiving the route adjustment instruction sent by the second ground station and sending the route adjustment instruction to the unmanned aerial vehicle;
the unmanned aerial vehicle is used for adjusting the route parameters of the unmanned aerial vehicle according to the route adjustment instruction.
In a fifth aspect, the present application provides an electronic device, including: a memory, a processor, the memory having stored thereon a computer program executable on the processor, the computer program being configured to implement the steps of the drone control method as described above.
In a sixth aspect, the present application provides a computer readable storage medium having stored therein a computer program which, when executed by a processor, causes the processor to perform the unmanned aerial vehicle control method as described in the first aspect or any of the possible implementations of the first aspect.
In a seventh aspect, embodiments of the present application provide a computer program comprising instructions for performing the unmanned aerial vehicle control method of the first aspect and any possible implementation of the first aspect.
The application provides an unmanned aerial vehicle control method applied to a first ground station: flight data and shooting data of an unmanned aerial vehicle are received and sent to a second ground station; a first operation instruction issued by a first user for the flight data and the shooting data, and a second operation instruction transmitted by the second ground station, are received; and the first operation instruction and the second operation instruction are sent to the unmanned aerial vehicle so as to control the current flight attitude and the current shooting attitude of the unmanned aerial vehicle. In this method, the first ground station and the second ground station cooperate: one ground station is responsible for controlling the current flight attitude of the unmanned aerial vehicle, and the other ground station is responsible for controlling its current shooting attitude. With the control functions thus separated and the two ground stations cooperating, the operational complexity of controlling the unmanned aerial vehicle under complex working conditions is effectively reduced, the control and inspection efficiency during beyond-visual-line-of-sight flight can be improved, and the flight tasks of the unmanned aerial vehicle can be completed more efficiently.
Drawings
Fig. 1 is a schematic flow chart of a first embodiment of a control method of a drone of the present application;
fig. 2 is a schematic flow chart of a second embodiment of the unmanned aerial vehicle control method of the present application;
fig. 3 is a schematic flow chart of a third embodiment of a control method of the unmanned aerial vehicle of the present application;
fig. 4 is a schematic structural diagram of an unmanned aerial vehicle control system according to an embodiment of the present application;
fig. 5 is a schematic device structure diagram of a hardware running environment according to an embodiment of the present application.
The realization, functional characteristics and advantages of the present application will be further described with reference to the embodiments, referring to the attached drawings.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may represent: A exists alone, A and B exist together, or B exists alone.
The terms first and second and the like in the description and in the claims of embodiments of the present application are used for distinguishing between different objects and not necessarily for describing a particular sequential order of objects. For example, the first target object and the second target object, etc., are used to distinguish between different target objects, and are not used to describe a particular order of target objects.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
To facilitate an understanding of the drone control method of the present application, the following is an explanation of the relevant terminology involved in the present application:
NX (NVIDIA Jetson Xavier NX) is an embedded artificial intelligence computing platform that provides high-performance AI computing capabilities for embedded systems and edge devices. It is widely applied in fields such as autonomous driving, intelligent robotics, industrial automation, medical imaging, and security monitoring. Its high performance and low power consumption make it a first choice among edge computing devices.
GStreamer is a functionally rich open source multimedia framework that provides powerful tools and libraries for building and processing audio and video streams, supports the retrieval of audio and video data from a variety of sources (e.g., cameras, microphones, files), and allows for encoding, decoding, conversion, blending, segmentation, transmission, and rendering. Its flexibility and extensibility enables developers to easily build a variety of multimedia applications.
MAVLink (Micro Air Vehicle Link) is a lightweight, connectionless communication protocol intended to provide a standardized message format for communication between the drone and the ground station. It is an asynchronous serial communication protocol that can be used to realize two-way communication between the unmanned aerial vehicle and the ground station, transmitting status information, control instructions, and other data. Through MAVLink messages, the ground station can communicate bidirectionally with the unmanned aerial vehicle to realize functions such as status monitoring, parameter setting, trajectory planning, and remote control operation.
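As a rough illustration of the framing such a protocol uses, the sketch below parses the 6-byte MAVLink v1 header of a raw frame. It follows the public MAVLink v1 field layout and deliberately omits checksum verification, which a real ground station must perform before trusting the payload.

```python
import struct

MAVLINK_V1_STX = 0xFE  # start-of-frame marker for MAVLink v1


def parse_mavlink_v1_header(frame: bytes) -> dict:
    """Parse the 6-byte MAVLink v1 header of a raw frame.

    The checksum (CRC-16/MCRF4XX with a per-message CRC_EXTRA byte)
    is NOT verified here; a real implementation must do so.
    """
    if len(frame) < 8 or frame[0] != MAVLINK_V1_STX:
        raise ValueError("not a MAVLink v1 frame")
    stx, length, seq, sysid, compid, msgid = struct.unpack_from("<6B", frame)
    payload = frame[6:6 + length]
    return {"len": length, "seq": seq, "sysid": sysid,
            "compid": compid, "msgid": msgid, "payload": payload}


# Example: a HEARTBEAT (msgid 0) frame skeleton with a 9-byte payload
frame = bytes([0xFE, 9, 42, 1, 1, 0]) + bytes(9) + bytes([0x00, 0x00])
hdr = parse_mavlink_v1_header(frame)
print(hdr["msgid"], hdr["sysid"], hdr["len"])  # 0 1 9
```

In practice a ground station would use an existing MAVLink library rather than parse frames by hand; the sketch only makes the "lightweight message format" claim concrete.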
RTP (Real-time Transport Protocol) is an application-layer protocol for transmitting audio, video, and other real-time data over IP networks. It provides mechanisms such as timestamps, sequence numbers, and synchronization sources, so that the receiver can reconstruct the data in order and achieve synchronized playback.
UDP (User Datagram Protocol) is a transport layer protocol that provides a simple, connectionless data transfer service. UDP transmits by dividing data into small packets and adding a source port and a destination port. Since it carries little control overhead, UDP has low latency and low bandwidth occupation, making it suitable for real-time application scenarios. RTP typically uses UDP as its underlying transport protocol in order to achieve fast transmission of real-time data: UDP's simple transmission mechanism allows real-time data such as video to be transmitted with low delay, which suits applications such as real-time communication, streaming media, and video conferencing.
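A minimal sketch of how a receiver reads the mechanisms this paragraph mentions (sequence number, timestamp, synchronization source) from the fixed 12-byte RTP header defined in RFC 3550:

```python
import struct


def parse_rtp_header(packet: bytes) -> dict:
    """Parse the fixed 12-byte RTP header (RFC 3550)."""
    if len(packet) < 12:
        raise ValueError("packet too short for an RTP header")
    b0, b1, seq, ts, ssrc = struct.unpack_from("!BBHII", packet)
    return {
        "version": b0 >> 6,            # always 2 for current RTP
        "padding": bool(b0 & 0x20),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,     # 96-127 are dynamic; H.264 often uses 96
        "sequence": seq,               # used to reorder packets
        "timestamp": ts,               # used for synchronized playback
        "ssrc": ssrc,                  # identifies the synchronization source
    }


# Version 2, dynamic payload type 96, sequence 1000
pkt = struct.pack("!BBHII", 0x80, 96, 1000, 90000, 0xDEADBEEF)
print(parse_rtp_header(pkt)["payload_type"])  # 96
```

The example values are illustrative; in the system described here this parsing is done internally by GStreamer's rtph264depay element rather than by hand.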
Referring to fig. 1, fig. 1 is a schematic flow chart of a first embodiment of a control method of a drone of the present application. It should be noted that although a logical order is depicted in the flowchart, in some cases the steps depicted or described may be performed in a different order than presented herein.
The first embodiment of the application provides a control method of an unmanned aerial vehicle, which is applied to a first ground station, and comprises the following steps:
step S100, receiving flight data and shooting data of an unmanned aerial vehicle, and sending the flight data and the shooting data to a second ground station;
In this embodiment, it should be noted that the first ground station may be a remote control terminal, a smart phone, a tablet computer, or another device capable of communicating with the unmanned aerial vehicle. The second ground station may be a PC (Personal Computer), a tablet computer, a server, a network device, or the like. The flight data may include data relating to the flight status of the unmanned aerial vehicle, such as its flight attitude, current position, and current speed. The shooting data is image data obtained by shooting the current surrounding environment of the unmanned aerial vehicle under the current flight attitude and the current shooting attitude.
The embodiment can receive flight data and shooting data of the unmanned aerial vehicle based on a first communication link (such as a Mavlink communication link, a UDP protocol communication link and the like) between the first ground station and the unmanned aerial vehicle through the first communication link. And transmitting the flight data and the shot data to a second ground station. It can be appreciated that the first ground station may display the flight data and the shooting data through a display device, so that a first user can know the flight state and the current surrounding environment of the unmanned aerial vehicle.
In some embodiments, the step of receiving the flight data and the shooting data of the unmanned aerial vehicle in step S100 includes:
step S110, receiving, using the GStreamer library, the shot video sent by the unmanned aerial vehicle to a specified UDP multicast address, and decoding and displaying the shot video;
and step S120, reading the communication module of the first ground station through GPIO, and parsing the Mavlink data sent by the unmanned aerial vehicle to obtain the flight data of the unmanned aerial vehicle.
Illustratively, the first ground station may receive, using the GStreamer library, the video stream of the shot video sent by the drone to a specified UDP (User Datagram Protocol) multicast address, and decode and display that video stream. The first ground station may also read and parse, through GPIO (General-Purpose Input/Output), the Mavlink (Micro Air Vehicle Link) data (i.e., the flight data) sent by the unmanned aerial vehicle from the communication module, and display information such as the attitude and position of the unmanned aerial vehicle. The first ground station may set the properties of the udpsrc element: specify the UDP multicast address (e.g., 255.255.255) through multicast-group, set auto-multicast to automatically join the UDP multicast group, and specify the UDP port number (e.g., 8555) and the network interface (eth1) on which to receive the multicast data (i.e., the shooting data).
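The udpsrc configuration described above can be sketched as a gst-launch-style element description. The property names (multicast-group, auto-multicast, port, multicast-iface) are real GStreamer udpsrc properties; the RTP/H.264 caps string and the multicast address in the example call are assumptions (the address printed in the text appears garbled):

```python
def build_udpsrc(multicast_group: str, port: int, iface: str) -> str:
    """Build a gst-launch-style udpsrc description with the properties
    described above. The caps string assumes an RTP-packetized H.264
    stream on dynamic payload type 96."""
    return (
        f"udpsrc multicast-group={multicast_group} auto-multicast=true "
        f"port={port} multicast-iface={iface} "
        f"caps=application/x-rtp,media=video,encoding-name=H264,payload=96"
    )


# Hypothetical values: port 8555 and interface eth1 come from the text;
# the multicast address here is only illustrative.
print(build_udpsrc("224.1.1.1", 8555, "eth1"))
```

Such a description string can be handed to `gst-launch-1.0` (with the decode elements appended) or to GStreamer's `gst_parse_launch` from application code.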
In some embodiments, the step of decoding and displaying the captured video in step S110 includes:
step S111, removing the header information of the RTP data packet corresponding to the shot video through an rtph264depay element;
step S112, parsing the H.264 video stream corresponding to the shot video through an h264parse element;
step S113, selecting and creating an adaptive decoder for the shot video through a decodebin element;
step S114, performing video format conversion on the shot video through a videoconvert element, converting the decoded video into RGB format;
step S115, displaying the shot video in RGB format on the screen through an autovideosink element.
Illustratively, in this embodiment, the header information of the RTP data packet corresponding to the shot video may be removed by the rtph264depay element; the H.264 video stream, with the RTP headers removed, may be parsed by the h264parse element; an adaptive decoder may be selected and created by the decodebin element according to the type of the shot video, to decode the H.264 video stream; the decoded video may be converted into RGB format by the videoconvert element; and the video in RGB format may be displayed on the screen by the autovideosink element. It will be appreciated that rtph264depay, h264parse, decodebin, videoconvert, and autovideosink are all GStreamer elements from the GStreamer library; a GStreamer element is a special GObject and is the basic building block from which GStreamer pipelines are constructed.
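The five steps above correspond one-to-one to GStreamer elements chained into a single decode pipeline. A small sketch that makes the step-to-element mapping explicit and produces the `!`-separated pipeline syntax understood by gst-launch-1.0 and gst_parse_launch:

```python
# Map each decoding step (S111-S115) to the GStreamer element performing it.
# The element names are real GStreamer elements; the ordering mirrors the steps.
DECODE_STEPS = [
    ("S111", "rtph264depay"),   # strip RTP packet headers
    ("S112", "h264parse"),      # parse the raw H.264 elementary stream
    ("S113", "decodebin"),      # auto-select and create a suitable decoder
    ("S114", "videoconvert"),   # convert decoded frames (e.g., to RGB)
    ("S115", "autovideosink"),  # display frames on screen
]


def decode_chain() -> str:
    """Join the elements into gst-launch pipeline syntax."""
    return " ! ".join(name for _, name in DECODE_STEPS)


print(decode_chain())
# rtph264depay ! h264parse ! decodebin ! videoconvert ! autovideosink
```

Prepending the udpsrc description from the receiving step yields a complete playback pipeline for the multicast stream.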
In some embodiments, before the step of transmitting the flight data and the photographing data to the second ground station in step S100, the method further includes:
step S130, modifying the source code of the operating system of the first ground station to acquire the super user identity;
step S140, based on the super user identity, connecting the network port eth1 of the first ground station to the communication module of the first ground station through a preset bridge tool, connecting the network port eth0 of the first ground station to the second ground station, and bridging the eth1 network interface and the eth0 network interface to be br0, so as to establish a br0 bridge between the first ground station and the second ground station.
The first ground station in this embodiment may send the flight data and the shooting data to the second ground station through a pre-established second communication link (e.g., a network bridge). Illustratively, the first ground station integrates the bridging tool bridge-utils. Meanwhile, by modifying the source code of the operating system of the first ground station (such as an Android system), an application program can call the su command (/system/xbin/su) to acquire operation authority over the relevant files; the su command switches to the superuser (also called the root user) identity. The network port eth1 of the first ground station is connected to the communication module of the first ground station, and the network port eth0 is connected to the second ground station; the application program can then run network configuration commands such as ifconfig, ip rule, and brctl to bridge the eth1 and eth0 network interfaces into br0. The brctl command is an abbreviation of "bridge control"; its function is to manage Ethernet bridges, and it can set, maintain, and check the Ethernet bridge configuration parameters in the system kernel. Data transmission between the first ground station and the second ground station can thus be achieved in the operating system of the first ground station by establishing the br0 bridge. The second ground station can listen on the specified UDP port through the br0 bridge and acquire the RTP video stream in real time for decoding and playback, so that the second ground station and the first ground station play the video images of the shot video synchronously.
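The bridging described above can be sketched as the sequence of brctl/ip commands an application would run with superuser rights. The exact command sequence below is an assumption consistent with standard bridge-utils usage, not the patent's literal configuration:

```python
def bridge_setup_commands(bridge: str = "br0",
                          ifaces: tuple = ("eth0", "eth1")) -> list:
    """Return the shell commands (to be run as root, e.g. via `su -c`)
    that bridge the two network interfaces into a single bridge device.
    Sketch only: a real ground station would also check exit statuses
    and restore the configuration on teardown."""
    cmds = [f"brctl addbr {bridge}"]          # create the bridge device
    for iface in ifaces:
        cmds.append(f"ip addr flush dev {iface}")   # drop per-iface addresses
        cmds.append(f"brctl addif {bridge} {iface}")  # enslave iface to bridge
    cmds.append(f"ip link set {bridge} up")   # bring the bridge up
    return cmds


for c in bridge_setup_commands():
    print(c)
```

An Android application would execute each command through the su binary mentioned above (e.g., `su -c "brctl addbr br0"`), which is why the superuser identity must be obtained first.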
Step S200, receiving a first operation instruction issued by a first user for the flight data and the shooting data, and a second operation instruction sent by the second ground station;
In this embodiment, the first operation instruction is an operation instruction issued by the first user for the flight data and the shooting data, and is used to control the current flight attitude and/or the current shooting attitude of the unmanned aerial vehicle. The first user is a user who controls the unmanned aerial vehicle by operating the first ground station. The second operation instruction is an operation instruction issued by a second user for the flight data and the shooting data, and is likewise used to control the current flight attitude and/or the current shooting attitude of the unmanned aerial vehicle. The second user is a user who controls the unmanned aerial vehicle by operating the second ground station.
This embodiment can receive, through an input device (such as a touch screen or a keyboard), the first operation instruction issued by the first user for the flight data and the shooting data, and can receive, through the pre-established second communication link, the second operation instruction sent by the second ground station, i.e., the operation instruction issued by the second user through the second ground station for the flight data and the shooting data, used to control the current flight attitude and/or the current shooting attitude of the unmanned aerial vehicle.
And step S300, the first operation instruction and the second operation instruction are sent to the unmanned aerial vehicle so as to control the current flight attitude and the current shooting attitude of the unmanned aerial vehicle.
In this embodiment, the current flight attitude is, illustratively, the attitude of the unmanned aerial vehicle under flight parameters such as pitch angle, yaw angle, and roll angle. The current shooting attitude is the attitude of the gimbal camera mounted on the unmanned aerial vehicle under shooting parameters such as gimbal steering and camera zoom.
After receiving a first operation instruction issued by a first user and a second operation instruction sent by the second ground station, the embodiment can send the first operation instruction and the second operation instruction to the unmanned aerial vehicle through the first communication link, so that the unmanned aerial vehicle can adjust flight parameters of the unmanned aerial vehicle and shooting parameters of a cradle head camera arranged on the unmanned aerial vehicle according to the first operation instruction and the second operation instruction, and control the current flight attitude and the current shooting attitude of the unmanned aerial vehicle.
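The relay behavior described above can be sketched in a few lines. This is a minimal illustration only: the JSON message format, the field names, and the `encapsulate_instructions` helper are assumptions made for the sketch; as noted later in the specification, a real implementation would encapsulate the manipulation commands into MAVLink protocol instructions.

```python
import json

def encapsulate_instructions(first_instruction, second_instruction):
    """Package the locally issued instruction and the instruction relayed
    from the second ground station into one payload for the first
    communication link (message format and field names are assumed)."""
    frames = []
    for source, instruction in (("first_ground_station", first_instruction),
                                ("second_ground_station", second_instruction)):
        frames.append({"source": source, "instruction": instruction})
    return json.dumps(frames).encode("utf-8")

# Example: the first user steers the aircraft, the second user steers the camera.
payload = encapsulate_instructions(
    {"type": "flight_attitude", "pitch": 5.0, "yaw": -10.0, "roll": 0.0},
    {"type": "shooting_attitude", "pan": 30.0, "tilt": -15.0, "zoom": 2.0},
)
```

Tagging each instruction with its originating ground station lets the unmanned aerial vehicle (or a logger) attribute every attitude change to one of the two operators.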
As an example, the first operation instruction may be an operation instruction for controlling the current flight attitude of the unmanned aerial vehicle, and the second operation instruction may be an operation instruction for controlling the current shooting attitude of the unmanned aerial vehicle. The first ground station can send the first operation instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls its current flight attitude according to the first operation instruction, and send the second operation instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls its current shooting attitude according to the second operation instruction.
As another example, the first operation instruction may be an operation instruction for controlling the current shooting attitude of the unmanned aerial vehicle, and the second operation instruction may be an operation instruction for controlling the current flight attitude of the unmanned aerial vehicle. The first ground station can send the first operation instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls its current shooting attitude according to the first operation instruction, and send the second operation instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls its current flight attitude according to the second operation instruction.
In some embodiments, the step of sending the first operation instruction and the second operation instruction to the unmanned aerial vehicle in step S300 to control the current flight attitude and the current shooting attitude of the unmanned aerial vehicle includes:
step S310, the first operation instruction is sent to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls the current flight attitude of the unmanned aerial vehicle according to the first operation instruction;
step S320, sending the second operation instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls the current shooting attitude of the unmanned aerial vehicle according to the second operation instruction.
In this embodiment, it should be noted that the first operation instruction is an operation instruction for controlling the current flight attitude of the unmanned aerial vehicle, and the second operation instruction is an operation instruction for controlling the current shooting attitude of the unmanned aerial vehicle.
Because the first ground station is directly in communication connection with the unmanned aerial vehicle, while the second ground station must interact with the unmanned aerial vehicle via the first ground station as a relay, and because the real-time requirements on flight attitude control are high under complex conditions, flight attitude control belongs at the first ground station. Therefore, to ensure the real-time performance of the current flight attitude control of the unmanned aerial vehicle, this embodiment sends the first operation instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls its current flight attitude according to the first operation instruction, and sends the second operation instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls its current shooting attitude according to the second operation instruction. In this way, the first ground station controls the current flight attitude of the unmanned aerial vehicle, ensuring the real-time performance of flight attitude control, so that the unmanned aerial vehicle can adapt to flight tasks under complex conditions.
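The division of duties argued for here can be expressed as a simple dispatch table. The function and labels below are hypothetical names for illustration; the point is only that latency-sensitive flight-attitude control stays with the directly connected first ground station, while latency-tolerant shooting-attitude control is relayed through it from the second ground station.

```python
def assign_control(instruction_type):
    """Latency-driven split of control duties between the two ground
    stations (illustrative sketch, not the patented implementation)."""
    mapping = {
        # Direct link to the aircraft: lowest latency, so flight control.
        "flight_attitude": "first_ground_station",
        # One extra relay hop is acceptable for camera control.
        "shooting_attitude": "second_ground_station",
    }
    return mapping[instruction_type]
```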
In some embodiments, after the step of transmitting the flight data and the shooting data to the second ground station in step S100, the method further comprises:
step A10, receiving a route adjustment instruction sent by the second ground station;
and step A20, sending the route adjustment instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle can adjust the route parameters of the unmanned aerial vehicle according to the route adjustment instruction.
In this embodiment, it should be noted that the route adjustment instruction is an operation instruction for adjusting the route parameters of the unmanned aerial vehicle, where the route parameters include waypoints, the route center line, flight altitude, take-off speed, route speed, and the like.
Because complex conditions place high demands on changes to the current flight attitude of the unmanned aerial vehicle, the first user operating the first ground station must be responsible for controlling the current flight attitude, and it is therefore often difficult for the first user to attend to other tasks. In this embodiment, the second ground station can take over route adjustment by receiving a route adjustment instruction issued by the second user for the flight data and the shooting data, and then sending the route adjustment instruction to the first ground station through the second communication link. After receiving the route adjustment instruction sent by the second ground station through the second communication link, the first ground station can send it to the unmanned aerial vehicle through the first communication link, so that the unmanned aerial vehicle adjusts its route parameters according to the route adjustment instruction. By assigning the route adjustment work to the second ground station, this embodiment ensures that the first ground station can focus on controlling the current flight attitude of the unmanned aerial vehicle, so that the unmanned aerial vehicle can adapt to flight tasks under complex conditions.
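As a sketch of how the route parameters listed above might be modeled and adjusted on receipt of a route adjustment instruction (the class and field names are assumptions for illustration, not the patented data format):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class RouteParameters:
    """Route parameters named in the specification (field names assumed)."""
    waypoints: tuple          # sequence of (lat, lon) waypoints
    centerline: tuple         # endpoints of the route center line
    altitude_m: float         # flight altitude
    takeoff_speed_mps: float  # take-off speed
    route_speed_mps: float    # route (cruise) speed

def apply_route_adjustment(params, adjustment):
    """Return a new RouteParameters with only the adjusted fields changed."""
    return replace(params, **adjustment)

# Example: the second ground station raises altitude and cruise speed.
original = RouteParameters(
    waypoints=((30.0, 120.0), (30.1, 120.1)),
    centerline=((30.0, 120.0), (30.1, 120.1)),
    altitude_m=80.0,
    takeoff_speed_mps=3.0,
    route_speed_mps=8.0,
)
adjusted = apply_route_adjustment(
    original, {"altitude_m": 120.0, "route_speed_mps": 10.0})
```

Keeping the parameters immutable and producing a fresh copy per adjustment makes it easy to log or roll back each route change.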
In a first embodiment of the present application, an unmanned aerial vehicle control method is provided, which is applied to a first ground station, and comprises: receiving flight data and shooting data of the unmanned aerial vehicle, and transmitting the flight data and the shooting data to a second ground station; receiving a first operation instruction issued by a first user for the flight data and the shooting data and a second operation instruction transmitted by the second ground station; and sending the first operation instruction and the second operation instruction to the unmanned aerial vehicle so as to control the current flight attitude and the current shooting attitude of the unmanned aerial vehicle. In this embodiment, the first ground station and the second ground station cooperate station-to-station: one ground station is responsible for controlling the current flight attitude of the unmanned aerial vehicle, and the other ground station is responsible for controlling the current shooting attitude. Therefore, through this separation of control functions and the cooperation of the first ground station and the second ground station, the operation complexity of the unmanned aerial vehicle under complex working conditions is effectively reduced, the control and inspection efficiency of the unmanned aerial vehicle during flight beyond the line of sight can be improved, and the flight task of the unmanned aerial vehicle can be completed more efficiently.
Referring to fig. 2, fig. 2 is a schematic flow chart of a second embodiment of the control method of the unmanned aerial vehicle of the present application.
In another embodiment of the present application, the same or similar content as the above embodiment may be referred to the above description, and will not be repeated. The second embodiment of the application provides a control method of an unmanned aerial vehicle, which is applied to a second ground station, and comprises the following steps:
step S400, receiving flight data and shooting data of the unmanned aerial vehicle forwarded by a first ground station;
step S500, receiving a second operation instruction issued by a second user for the flight data and the shooting data;
and step S600, forwarding the second operation instruction to the unmanned aerial vehicle through a first ground station, so that the unmanned aerial vehicle controls the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to the first operation instruction and the second operation instruction of the first ground station.
In this embodiment, it should be noted that the second operation instruction is an operation instruction issued by the second user for the flight data and the shooting data, where the second operation instruction is used to control a current flight posture and/or a current shooting posture of the unmanned aerial vehicle. The second user is a user controlling the unmanned aerial vehicle by operating a second ground station.
The second ground station is connected with the first ground station through a second communication link and is used for receiving flight data and shooting data of the unmanned aerial vehicle forwarded by the first ground station. And then the second ground station can display the flight data and the shooting data through a display device so that a second user can know the flight state and the current surrounding environment of the unmanned aerial vehicle. And further, a second operation instruction issued by a second user for the flight data and the shooting data can be received through an input device (such as a touch screen, a keyboard and the like). And then, the second operation instruction can be sent to the first ground station through a second communication link, and the first ground station forwards the second operation instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to the first operation instruction and the second operation instruction of the first ground station.
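A minimal sketch of the second communication link, assuming plain JSON over UDP for illustration (the specification leaves the second-link protocol open, so the message format here is an assumption); the demonstration runs both endpoints on the loopback interface:

```python
import json
import socket

def send_to_first_ground_station(sock, address, instruction):
    """Serialize a second-user instruction and send it over the
    (assumed) UDP-based second communication link."""
    sock.sendto(json.dumps(instruction).encode("utf-8"), address)

# First-ground-station side: bind a receiving socket on loopback.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))  # OS picks a free port
addr = recv_sock.getsockname()

# Second-ground-station side: issue a shooting-attitude instruction.
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_to_first_ground_station(
    send_sock, addr,
    {"type": "shooting_attitude", "pan": 45.0, "tilt": -10.0, "zoom": 1.5})

data, _ = recv_sock.recvfrom(4096)
received = json.loads(data.decode("utf-8"))
send_sock.close()
recv_sock.close()
```

The first ground station would then forward `received` to the unmanned aerial vehicle over the first communication link, as described above.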
In some embodiments, after the step of receiving the flight data and the shooting data of the unmanned aerial vehicle forwarded by the first ground station in step S400, the method further includes:
step B10, receiving a route adjustment instruction issued by the second user for the flight data and the shooting data;
And step B20, forwarding the route adjustment instruction to the unmanned aerial vehicle through a first ground station so that the unmanned aerial vehicle can adjust the route parameters of the unmanned aerial vehicle according to the route adjustment instruction.
In this embodiment, it should be noted that the route adjustment instruction is an operation instruction for adjusting the route parameters of the unmanned aerial vehicle, where the route parameters include waypoints, the route center line, flight altitude, take-off speed, route speed, and the like.
Because the real-time control requirements for the unmanned aerial vehicle are higher under complex conditions, it must be ensured that the first ground station can focus on controlling the unmanned aerial vehicle. In this embodiment, the second ground station can take over route adjustment by receiving a route adjustment instruction issued by the second user for the flight data and the shooting data, and then sending the route adjustment instruction to the first ground station through the second communication link. After receiving the route adjustment instruction sent by the second ground station through the second communication link, the first ground station can send it to the unmanned aerial vehicle through the first communication link, so that the unmanned aerial vehicle adjusts its route parameters according to the route adjustment instruction. By assigning the route adjustment work to the second ground station, this embodiment ensures that the first ground station can focus on controlling the unmanned aerial vehicle, so that the unmanned aerial vehicle can be suitable for flight tasks under complex conditions.
The second embodiment of the application provides an unmanned aerial vehicle control method, which is applied to a second ground station, and comprises: receiving flight data and shooting data of the unmanned aerial vehicle forwarded by a first ground station; receiving a second operation instruction issued by a second user for the flight data and the shooting data; and forwarding the second operation instruction to the unmanned aerial vehicle through the first ground station, so that the unmanned aerial vehicle controls its current flight attitude and current shooting attitude according to the first operation instruction of the first ground station and the second operation instruction. In this application, the second ground station shares part of the first ground station's operation work on the unmanned aerial vehicle, so that, through the separation of control functions and the cooperation of the first ground station and the second ground station, the operation complexity of the unmanned aerial vehicle under complex working conditions is effectively reduced, the control and inspection efficiency of the unmanned aerial vehicle during flight beyond the line of sight can be improved, and the unmanned aerial vehicle flight task can be completed more efficiently.
Referring to fig. 3, fig. 3 is a schematic flow chart of a third embodiment of the unmanned aerial vehicle control method of the present application.
In another embodiment of the present application, the same or similar content as the above embodiment may be referred to the above description, and will not be repeated. The third embodiment of the application provides an unmanned aerial vehicle control method, which is applied to an unmanned aerial vehicle, and comprises the following steps:
Step S700, acquiring flight data of the unmanned aerial vehicle, and shooting the current surrounding environment of the unmanned aerial vehicle to obtain shooting data;
step S800, transmitting the flight data and the shooting data to a first ground station, so that the first ground station forwards the flight data and the shooting data to a second ground station;
step S900, a first operation instruction of a first ground station and a second operation instruction transmitted by the second ground station forwarded by the first ground station are received;
and step S1000, controlling the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to the first operation instruction and the second operation instruction.
In this embodiment, it should be noted that a pan-tilt camera is disposed on the unmanned aerial vehicle. The flight data may include data relating to a flight status of the unmanned aerial vehicle, such as a flight attitude, a current position, a current speed, and the like of the unmanned aerial vehicle. The shooting data are image data obtained by shooting the current surrounding environment of the unmanned aerial vehicle under the current flight attitude and the current shooting attitude.
In this embodiment, the flight data of the unmanned aerial vehicle can be acquired through the sensors on the unmanned aerial vehicle, and the current surrounding environment of the unmanned aerial vehicle under the current flight attitude and the current shooting attitude is shot through the pan-tilt camera arranged on the unmanned aerial vehicle, so that the shooting data are obtained. The flight data and the shooting data may then be transmitted to the first ground station over the first communication link between the first ground station and the unmanned aerial vehicle, so that the first ground station forwards the flight data and the shooting data to the second ground station over the second communication link. The unmanned aerial vehicle may then receive the first operation instruction transmitted by the first ground station via the first communication link and the second operation instruction transmitted by the second ground station via the first ground station, and control its current flight attitude and current shooting attitude according to the first operation instruction and the second operation instruction.
As an example, the first operation instruction may be an operation instruction for controlling the current flight attitude of the unmanned aerial vehicle, and the second operation instruction may be an operation instruction for controlling the current shooting attitude of the unmanned aerial vehicle. In this case, the unmanned aerial vehicle controls its current flight attitude according to the first operation instruction and its current shooting attitude according to the second operation instruction. As another example, the first operation instruction may be an operation instruction for controlling the current shooting attitude of the unmanned aerial vehicle, and the second operation instruction may be an operation instruction for controlling the current flight attitude of the unmanned aerial vehicle. In this case, the unmanned aerial vehicle controls its current shooting attitude according to the first operation instruction and its current flight attitude according to the second operation instruction.
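A toy stand-in for the unmanned aerial vehicle side of step S1000 might look as follows. The `DroneState` class and its fields are hypothetical, chosen to match the flight parameters (pitch, yaw, roll) and shooting parameters (pan-tilt steering, zoom) named earlier in this specification:

```python
class DroneState:
    """Minimal stand-in for the flight-control and gimbal state
    (field names are assumptions for illustration)."""

    def __init__(self):
        self.flight = {"pitch": 0.0, "yaw": 0.0, "roll": 0.0}
        self.gimbal = {"pan": 0.0, "tilt": 0.0, "zoom": 1.0}

    def apply(self, instruction):
        """Route an operation instruction to the flight-control
        parameters or the pan-tilt camera parameters by its type."""
        kind = instruction["type"]
        if kind == "flight_attitude":
            target = self.flight
        elif kind == "shooting_attitude":
            target = self.gimbal
        else:
            raise ValueError(f"unknown instruction type: {kind}")
        for key in target:
            if key in instruction:
                target[key] = instruction[key]

# One instruction from each ground station, applied in turn.
drone = DroneState()
drone.apply({"type": "flight_attitude", "pitch": 5.0, "yaw": -10.0})
drone.apply({"type": "shooting_attitude", "zoom": 2.0})
```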
For example, the unmanned aerial vehicle may first be initialized and started, and the unmanned aerial vehicle and the first ground station may establish a first communication link (such as a MAVLink communication link, a UDP protocol communication link, etc.) through communication module pairing. After the unmanned aerial vehicle is started, the unmanned aerial vehicle operating system (such as an NX Ubuntu system) automatically logs in and runs a preset script, where the preset script is written with the GStreamer library to implement acquisition, encoding, and transmission of the video stream. The unmanned aerial vehicle runs the preset script to connect the pan-tilt camera and read video data (i.e., the shooting data) from the /dev/video0 device (i.e., the pan-tilt camera); set the video encoder to x264enc and enable zero-latency mode; package the H.264-encoded video data into the RTP protocol, setting the configuration-frame transmission interval to -1 and the maximum transmission unit to 1400 bytes; set the data queue to leaky=downstream, meaning that if downstream does not process data in time, some data are discarded so that the stream remains smooth; and send the video data via the UDP protocol to a specified UDP port number (e.g., port 8555) at the broadcast address (255.255.255.255). The unmanned aerial vehicle collects flight data such as the aircraft attitude and position, sends the flight data to the first ground station through the first communication link, and receives MAVLink protocol instructions sent by the first ground station through the first communication link. It will be appreciated that the first ground station may encapsulate all manipulation commands (i.e., the first operation instruction and the second operation instruction) into MAVLink protocol commands and send them to the unmanned aerial vehicle through the first communication link.
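The video path described in this paragraph maps naturally onto a gst-launch style pipeline string. The sketch below is an illustrative reconstruction under the stated parameters (x264enc with zero-latency tuning, rtph264pay with config-interval=-1 and mtu=1400, a leaky downstream queue, UDP output on port 8555), not the actual preset script; the element and property names are GStreamer's documented ones, while the function itself is assumed for illustration.

```python
def build_video_pipeline(device="/dev/video0", port=8555,
                         host="255.255.255.255"):
    """Assemble a gst-launch style GStreamer pipeline string matching the
    parameters described in the specification (illustrative sketch)."""
    return " ! ".join([
        f"v4l2src device={device}",               # read from the pan-tilt camera
        "videoconvert",                            # adapt raw format for the encoder
        "x264enc tune=zerolatency",                # H.264 encoding, zero-delay mode
        "rtph264pay config-interval=-1 mtu=1400",  # RTP packaging as described
        "queue leaky=downstream",                  # drop stale data if downstream lags
        f"udpsink host={host} port={port}",        # send over UDP
    ])
```

Such a string can be launched with `gst-launch-1.0` or parsed programmatically with `Gst.parse_launch` from the GStreamer Python bindings.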
In some embodiments, the step of controlling the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to the first operation instruction and the second operation instruction in step S1000 includes:
step S1010, controlling a current flight attitude of the unmanned aerial vehicle according to the first operation instruction;
step S1020, controlling the current shooting attitude of the unmanned aerial vehicle according to the second operation instruction.
Because the first ground station is directly in communication connection with the unmanned aerial vehicle, while the second ground station must interact with the unmanned aerial vehicle via the first ground station as a relay, and because the real-time requirements on flight attitude control are high under complex conditions, the first ground station is used to control the current flight attitude of the unmanned aerial vehicle. This ensures the real-time performance of the current flight attitude control, so that the unmanned aerial vehicle can adapt to flight tasks under complex conditions. The unmanned aerial vehicle accordingly controls its current flight attitude according to the first operation instruction and its current shooting attitude according to the second operation instruction.
In some embodiments, after the step of transmitting the flight data and the shooting data to a first ground station in step S800, so that the first ground station forwards the flight data and the shooting data to a second ground station, the method further comprises:
step C10, receiving a route adjustment instruction sent by the second ground station and forwarded by the first ground station;
and step C20, adjusting the route parameters of the unmanned aerial vehicle according to the route adjustment instruction.
In this embodiment, it should be noted that the route adjustment instruction is an operation instruction for adjusting the route parameters of the unmanned aerial vehicle, where the route parameters include waypoints, the route center line, flight altitude, take-off speed, route speed, and the like.
Because complex conditions place high demands on changes to the current flight attitude of the unmanned aerial vehicle, the first user operating the first ground station must be responsible for controlling the current flight attitude, and it is therefore often difficult for the first user to attend to other tasks. In this embodiment, the second ground station can take over route adjustment by receiving a route adjustment instruction issued by the second user for the flight data and the shooting data, and then sending the route adjustment instruction to the first ground station through the second communication link. After receiving the route adjustment instruction sent by the second ground station through the second communication link, the first ground station can send it to the unmanned aerial vehicle through the first communication link, so that the unmanned aerial vehicle adjusts its route parameters according to the route adjustment instruction. By assigning the route adjustment work to the second ground station, this embodiment ensures that the first ground station can focus on controlling the current flight attitude of the unmanned aerial vehicle, so that the unmanned aerial vehicle can adapt to flight tasks under complex conditions.
In a third embodiment of the present application, an unmanned aerial vehicle control method is provided, which is applied to an unmanned aerial vehicle, and comprises: collecting flight data of the unmanned aerial vehicle and shooting the current surrounding environment of the unmanned aerial vehicle to obtain shooting data; transmitting the flight data and the shooting data to a first ground station, so that the first ground station forwards the flight data and the shooting data to a second ground station; receiving a first operation instruction of the first ground station and a second operation instruction transmitted by the second ground station and forwarded by the first ground station; and controlling the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to the first operation instruction and the second operation instruction. In this embodiment, the first ground station and the second ground station cooperate station-to-station: one ground station is responsible for controlling the current flight attitude of the unmanned aerial vehicle, and the other ground station is responsible for controlling the current shooting attitude. Therefore, through this separation of control functions and the cooperation of the first ground station and the second ground station, the operation complexity of the unmanned aerial vehicle under complex working conditions is effectively reduced, the control and inspection efficiency of the unmanned aerial vehicle during flight beyond the line of sight can be improved, and the flight task of the unmanned aerial vehicle can be completed more efficiently.
Referring to fig. 4, fig. 4 is a schematic structural diagram of the unmanned aerial vehicle control system of the present application.
As shown in fig. 4, the present application provides an unmanned aerial vehicle control system, the unmanned aerial vehicle control system comprising: the unmanned aerial vehicle 10, the first ground station 20, and the second ground station 30;
the unmanned aerial vehicle 10 is configured to collect flight data of the unmanned aerial vehicle 10, shoot a current surrounding environment of the unmanned aerial vehicle 10, obtain shooting data, and send the flight data and the shooting data to the first ground station 20;
a first ground station 20 for receiving the flight data and shooting data of the unmanned aerial vehicle 10 and transmitting the flight data and the shooting data to a second ground station 30;
the second ground station 30 is configured to receive flight data and shooting data of the unmanned aerial vehicle 10 forwarded by the first ground station 20, receive a second operation instruction issued by a second user for the flight data and the shooting data, and send the second operation instruction to the first ground station 20;
the first ground station 20 is configured to receive a first operation instruction issued by a first user for the flight data and the shooting data and a second operation instruction sent by the second ground station 30, and send the first operation instruction and the second operation instruction to the unmanned aerial vehicle 10;
The unmanned aerial vehicle 10 is configured to control a current flight attitude and a current shooting attitude of the unmanned aerial vehicle 10 according to the first operation instruction and the second operation instruction.
In this embodiment, the unmanned aerial vehicle 10 may include a first communication module 11, a flight control module 12, and a pan-tilt camera 13. The unmanned aerial vehicle 10 acquires its flight data through the flight control module 12, shoots its current surrounding environment through the pan-tilt camera 13 to obtain shooting data, and sends the flight data and the shooting data to the first ground station 20 through the first communication module 11. The first ground station 20 includes a second communication module 21, a remote control module 22, and a third communication module 23; the first ground station 20 receives the flight data and the shooting data of the unmanned aerial vehicle 10 through the second communication module 21 and transmits them to the second ground station 30 through the third communication module 23. The second ground station 30 includes a fourth communication module 31, through which it receives the flight data and the shooting data of the unmanned aerial vehicle 10 forwarded by the first ground station 20. The second ground station 30 receives, through an input device, a second operation instruction issued by the second user for the flight data and the shooting data, and transmits it to the first ground station 20 through the fourth communication module 31. The first ground station 20 receives, through the remote control module 22, a first operation instruction issued by the first user for the flight data and the shooting data, and receives, through the third communication module 23, the second operation instruction sent by the second ground station 30 via the fourth communication module 31. The first ground station 20 then transmits the first operation instruction and the second operation instruction to the unmanned aerial vehicle 10 through the second communication module 21.
The unmanned aerial vehicle 10 receives the first operation instruction and the second operation instruction sent by the first ground station through the second communication module 21 through the first communication module 11, and controls the current flight attitude and the current shooting attitude of the unmanned aerial vehicle 10 through the flight control module 12 and the pan-tilt camera 13 according to the first operation instruction and the second operation instruction.
In some embodiments, the unmanned aerial vehicle control system further includes:
a second ground station 30, configured to receive a route adjustment instruction issued by the second user for the flight data and the shooting data, and send the route adjustment instruction to the first ground station 20;
the first ground station 20 is configured to receive the route adjustment instruction sent by the second ground station 30, and send the route adjustment instruction to the unmanned aerial vehicle 10;
the unmanned aerial vehicle 10 is configured to adjust the route parameters of the unmanned aerial vehicle 10 according to the route adjustment instruction.
The second ground station 30 receives, through an input device, the route adjustment instruction issued by the second user for the flight data and the shooting data, and sends the route adjustment instruction to the first ground station 20 through the fourth communication module 31. The first ground station 20 receives the route adjustment instruction through the third communication module 23 and sends it to the unmanned aerial vehicle 10 through the second communication module 21. The unmanned aerial vehicle 10 then adjusts its route parameters according to the route adjustment instruction.
The unmanned aerial vehicle control system provided by the application adopts the unmanned aerial vehicle control method of each of the above embodiments, and solves the technical problem that existing unmanned aerial vehicle control methods have high operation complexity under complex working conditions. Compared with the prior art, the beneficial effects of the unmanned aerial vehicle control system provided by the embodiments of the present application are the same as those of the unmanned aerial vehicle control method provided by the above embodiments, and the other technical features in the unmanned aerial vehicle control system are the same as those disclosed in the method embodiments, so details are omitted here.
As shown in fig. 5, fig. 5 is a schematic structural diagram of a device in a hardware running environment according to an embodiment of the present application.
Specifically, the electronic device may be a first ground station, a second ground station, or an unmanned aerial vehicle. The first ground station may be a remote control terminal, a smart phone, a tablet computer, or the like that can communicate with the unmanned aerial vehicle. The second ground station may be a PC (personal computer), a tablet computer, a server, a network device, or the like.
As shown in fig. 5, the electronic device may include: a processor 1001, such as a central processing unit (CPU); a communication bus 1002; a user interface 1003; a network interface 1004; and a memory 1005. The communication bus 1002 is used to implement connection and communication among these components. The user interface 1003 may include a display (Display) and an input unit such as a keyboard (Keyboard), and may further include a standard wired interface and a wireless interface. Optionally, the network interface 1004 may include a standard wired interface and a wireless interface (e.g., a Wireless-Fidelity (Wi-Fi) interface). The memory 1005 may be a high-speed random access memory (RAM) or a stable non-volatile memory (NVM), such as a disk memory. Optionally, the memory 1005 may also be a storage device independent of the processor 1001.
It will be appreciated by those skilled in the art that the device structure shown in fig. 5 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than shown, or combine certain components, or have a different arrangement of components.
As shown in fig. 5, the memory 1005, which is a computer storage medium, may include an operating system, a network communication module, a user interface module, and a computer program.
In the device shown in fig. 5, the network interface 1004 is mainly used for connecting to a background server and performing data communication with the background server; the user interface 1003 is mainly used for connecting to a client and performing data communication with the client; and the processor 1001 may be configured to invoke the computer program stored in the memory 1005 to implement the operations in the unmanned aerial vehicle control method provided in the above embodiments.
In addition, the embodiment of the present application further provides a computer storage medium, where a computer program is stored on the computer storage medium, and when the computer program is executed by a processor, the operations in the unmanned aerial vehicle control method provided in the foregoing embodiment are implemented, and specific steps are not described in detail herein.
It should be noted that, in this document, relational terms such as "first" and "second" are used solely to distinguish one entity/operation/object from another, without necessarily requiring or implying any actual relationship or order between such entities/operations/objects. The terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and for relevant points, reference may be made to the description of the method embodiments. The device embodiments described above are merely illustrative, and the units described as separate components may or may not be physically separate. Some or all of the modules may be selected according to actual needs to achieve the purposes of the solutions of the embodiments. Those of ordinary skill in the art can understand and implement the solutions without creative effort.
The foregoing embodiment numbers of the present application are merely for description and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or by means of hardware, although in many cases the former is the preferred implementation. Based on such an understanding, the technical solution of the present application, in essence or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a television, a network device, or the like) to perform the methods described in the embodiments of the present application.
The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the claims, and all equivalent structures or equivalent processes using the descriptions and drawings of the present application, or direct or indirect application in other related technical fields are included in the scope of the claims of the present application.

Claims (15)

1. A method of unmanned aerial vehicle control, characterized by being applied to a first ground station, the method comprising the steps of:
receiving flight data and shooting data of the unmanned aerial vehicle, and sending the flight data and the shooting data to a second ground station;
receiving a first operation instruction issued by a first user aiming at the flight data and the shooting data and a second operation instruction transmitted by the second ground station;
and sending the first operation instruction and the second operation instruction to the unmanned aerial vehicle so as to control the current flight attitude and the current shooting attitude of the unmanned aerial vehicle.
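The three steps of claim 1 amount to a relay at the first ground station: telemetry flows downstream to the second ground station, and the two operation instructions flow upstream to the unmanned aerial vehicle. A minimal sketch follows; the class, method, and link names are illustrative only and do not appear in the patent, and real links would be the communication modules described in the embodiments:

```python
class FirstGroundStation:
    """Illustrative relay logic of claim 1: forward telemetry downstream,
    merge operation instructions upstream. Names are hypothetical."""

    def __init__(self, drone_link, second_station_link):
        self.drone_link = drone_link        # link to the unmanned aerial vehicle
        self.second_link = second_station_link  # link to the second ground station

    def relay_telemetry(self, flight_data, shooting_data):
        # Step 1: send the received flight data and shooting data
        # to the second ground station.
        self.second_link.send({"flight": flight_data, "shooting": shooting_data})

    def forward_instructions(self, first_op, second_op):
        # Steps 2-3: combine the first operation instruction (issued locally
        # by the first user) with the second operation instruction (received
        # from the second ground station) and send both to the drone,
        # which uses them to control its flight and shooting attitude.
        self.drone_link.send({"first_op": first_op, "second_op": second_op})
```

A stub link object with a `send` method is enough to exercise the relay in isolation.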
2. The unmanned aerial vehicle control method of claim 1, wherein the step of sending the first and second operating instructions to the unmanned aerial vehicle to control the current flight attitude and current shooting attitude of the unmanned aerial vehicle comprises:
sending the first operation instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls the current flight attitude of the unmanned aerial vehicle according to the first operation instruction;
and sending the second operation instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls the current shooting gesture of the unmanned aerial vehicle according to the second operation instruction.
3. The unmanned aerial vehicle control method of claim 2, wherein after the step of sending the flight data and the shooting data to a second ground station, the method further comprises:
receiving a route adjustment instruction sent by the second ground station;
and sending the route adjustment instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle adjusts the route parameters of the unmanned aerial vehicle according to the route adjustment instruction.
4. The unmanned aerial vehicle control method of claim 1, wherein the step of receiving the flight data and the shooting data of the unmanned aerial vehicle comprises:
receiving, through a specified UDP multicast address and by using a GStreamer library, the shot video sent by the unmanned aerial vehicle to the specified UDP multicast address, and decoding and displaying the shot video;
and reading the communication module of the first ground station through GPIO, and parsing the MAVLink data sent by the unmanned aerial vehicle to obtain the flight data of the unmanned aerial vehicle.
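In practice the MAVLink stream read from the communication module would be parsed with a library such as pymavlink. Purely as an illustration of what that parsing involves, the sketch below decodes only the 10-byte MAVLink v2 packet header (magic byte 0xFD, payload length, flags, sequence number, system/component IDs, and a 24-bit little-endian message ID); payload extraction and checksum verification are omitted:

```python
import struct

MAVLINK_V2_MAGIC = 0xFD  # start-of-frame marker for MAVLink v2

def parse_mavlink_v2_header(buf: bytes):
    """Parse the MAVLink v2 header from a raw byte buffer, e.g. bytes
    read from the ground station's communication module. Returns None
    if the buffer is too short or does not start with the v2 magic byte."""
    if len(buf) < 10 or buf[0] != MAVLINK_V2_MAGIC:
        return None
    payload_len, incompat, compat, seq, sysid, compid = struct.unpack_from(
        "<BBBBBB", buf, 1
    )
    # The message ID is a 24-bit little-endian integer in bytes 7..9.
    msgid = buf[7] | (buf[8] << 8) | (buf[9] << 16)
    return {
        "payload_len": payload_len,
        "seq": seq,
        "sysid": sysid,
        "compid": compid,
        "msgid": msgid,
    }
```

Telemetry such as attitude and position would then be decoded from the payload according to the message ID, which a full MAVLink library handles automatically.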
5. The unmanned aerial vehicle control method of claim 4, wherein the step of decoding and displaying the shot video comprises:
removing the header information of the RTP data packets corresponding to the shot video through an rtph264depay element;
parsing the H.264 video stream corresponding to the shot video through an h264parse element;
selecting and creating a suitable decoder for the shot video through a decodebin element;
performing video format conversion on the shot video through a videoconvert element, and converting the decoded shot video into an RGB format;
and displaying the shot video in the RGB format on a screen through an autovideosink element.
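The element chain of claims 4 and 5 corresponds directly to a GStreamer pipeline description. As a sketch, the helper below assembles such a description from a multicast address and port (both placeholders; the patent does not specify their values), with RTP caps assumed for an H.264 payload:

```python
def build_video_pipeline(mcast_addr: str, port: int) -> str:
    """Assemble a gst-launch-1.0 pipeline description matching the
    element chain of claim 5. Address, port, and caps are assumptions."""
    caps = "application/x-rtp,media=video,encoding-name=H264,payload=96"
    elements = [
        f'udpsrc address={mcast_addr} port={port} caps="{caps}"',
        "rtph264depay",   # strip the RTP packet headers
        "h264parse",      # parse the raw H.264 elementary stream
        "decodebin",      # auto-select and create a suitable decoder
        "videoconvert",   # convert decoded frames (e.g. to RGB)
        "autovideosink",  # display the video on screen
    ]
    return " ! ".join(elements)
```

The resulting string could be run directly, e.g. `gst-launch-1.0 udpsrc address=... ! rtph264depay ! ...`, or passed to `Gst.parse_launch()` from the GStreamer Python bindings.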
6. The unmanned aerial vehicle control method of claim 1, wherein before the step of sending the flight data and the shooting data to a second ground station, the method comprises:
modifying the source code of the operating system of the first ground station to obtain a superuser identity;
and based on the superuser identity, connecting the network port eth1 of the first ground station to the communication module of the first ground station through a preset network bridging tool, connecting the network port eth0 of the first ground station to the second ground station, and bridging the eth1 and eth0 network interfaces into br0, so as to establish a br0 network bridge between the first ground station and the second ground station.
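The patent names a "preset network bridging tool" without specifying it; on Linux a common choice is the iproute2 suite (or the older brctl). As a sketch under that assumption, the helper below only assembles the commands that would bridge the two ports of claim 6 into br0; executing them requires the superuser identity the claim describes:

```python
def bridge_commands(bridge: str = "br0", ifaces=("eth0", "eth1")) -> list[str]:
    """Build the iproute2 commands that bridge the given interfaces into
    one bridge. Interface/bridge names default to those in claim 6; the
    choice of iproute2 is an assumption, not stated in the patent."""
    cmds = [f"ip link add name {bridge} type bridge"]  # create the bridge
    for iface in ifaces:
        cmds.append(f"ip link set {iface} master {bridge}")  # enslave port
        cmds.append(f"ip link set {iface} up")               # bring port up
    cmds.append(f"ip link set {bridge} up")                  # activate bridge
    return cmds
```

The commands could then be executed one by one, e.g. with `subprocess.run(cmd.split(), check=True)`, once root privileges are available.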
7. A method of unmanned aerial vehicle control, characterized by being applied to a second ground station, the method comprising the steps of:
receiving flight data and shooting data of the unmanned aerial vehicle forwarded by a first ground station;
receiving a second operation instruction issued by a second user aiming at the flight data and the shooting data;
forwarding the second operation instruction to the unmanned aerial vehicle through the first ground station, so that the unmanned aerial vehicle controls the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to a first operation instruction of the first ground station and the second operation instruction.
8. The unmanned aerial vehicle control method of claim 7, wherein after the step of receiving the flight data and the shooting data of the unmanned aerial vehicle forwarded by the first ground station, the method further comprises:
receiving a route adjustment instruction issued by the second user aiming at the flight data and the shooting data;
and forwarding the route adjustment instruction to the unmanned aerial vehicle through the first ground station, so that the unmanned aerial vehicle adjusts the route parameters of the unmanned aerial vehicle according to the route adjustment instruction.
9. A method of unmanned aerial vehicle control, characterized by being applied to an unmanned aerial vehicle, the method comprising the steps of:
acquiring flight data of the unmanned aerial vehicle, and shooting the current surrounding environment of the unmanned aerial vehicle to obtain shooting data;
transmitting the flight data and the shooting data to a first ground station, so that the first ground station forwards the flight data and the shooting data to a second ground station;
receiving a first operation instruction sent by the first ground station and a second operation instruction that is sent by the second ground station and forwarded by the first ground station;
and controlling the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to the first operation instruction and the second operation instruction.
10. The unmanned aerial vehicle control method of claim 9, wherein the step of controlling the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to the first operation instruction and the second operation instruction comprises:
according to the first operation instruction, controlling the current flight attitude of the unmanned aerial vehicle;
and controlling the current shooting gesture of the unmanned aerial vehicle according to the second operation instruction.
11. The unmanned aerial vehicle control method of claim 10, wherein after the step of sending the flight data and the shooting data to a first ground station, so that the first ground station forwards the flight data and the shooting data to a second ground station, the method further comprises:
receiving a route adjustment instruction that is sent by the second ground station and forwarded by the first ground station;
and adjusting the route parameters of the unmanned aerial vehicle according to the route adjustment instruction.
12. An unmanned aerial vehicle control system, the unmanned aerial vehicle control system comprising: the unmanned aerial vehicle comprises an unmanned aerial vehicle, a first ground station and a second ground station;
the unmanned aerial vehicle is used for collecting flight data of the unmanned aerial vehicle, shooting the current surrounding environment of the unmanned aerial vehicle to obtain shooting data, and sending the flight data and the shooting data to the first ground station;
the first ground station is used for receiving flight data and shooting data of the unmanned aerial vehicle and sending the flight data and the shooting data to the second ground station;
the second ground station is used for receiving flight data and shooting data of the unmanned aerial vehicle forwarded by the first ground station, receiving a second operation instruction issued by a second user aiming at the flight data and the shooting data, and sending the second operation instruction to the first ground station;
the first ground station is used for receiving a first operation instruction issued by a first user aiming at the flight data and the shooting data and the second operation instruction sent by the second ground station, and sending the first operation instruction and the second operation instruction to the unmanned aerial vehicle;
the unmanned aerial vehicle is used for controlling the current flight attitude and the current shooting attitude of the unmanned aerial vehicle according to the first operation instruction and the second operation instruction.
13. The unmanned aerial vehicle control system of claim 12, further comprising:
the second ground station is used for receiving route adjustment instructions issued by a second user aiming at the flight data and the shooting data and sending the route adjustment instructions to the first ground station;
the first ground station is used for receiving the route adjustment instruction sent by the second ground station and sending the route adjustment instruction to the unmanned aerial vehicle;
the unmanned aerial vehicle is used for adjusting the route parameters of the unmanned aerial vehicle according to the route adjustment instruction.
14. An electronic device, the electronic device comprising: a memory, a processor, the memory having stored thereon a computer program executable on the processor, the computer program, when executed by the processor, implementing the steps of the drone control method of any one of claims 1 to 11.
15. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the drone control method according to any one of claims 1 to 11.
CN202311699037.3A 2023-12-11 2023-12-11 Unmanned aerial vehicle control method, unmanned aerial vehicle control system, electronic equipment and computer readable storage medium Pending CN117873148A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311699037.3A CN117873148A (en) 2023-12-11 2023-12-11 Unmanned aerial vehicle control method, unmanned aerial vehicle control system, electronic equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN117873148A true CN117873148A (en) 2024-04-12

Family

ID=90587403

Country Status (1)

Country Link
CN (1) CN117873148A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination