CN111679680A - Unmanned aerial vehicle autonomous landing method and system

Unmanned aerial vehicle autonomous landing method and system

Info

Publication number
CN111679680A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
landing
data
autonomous
Legal status: Pending
Application number
CN201911421348.7A
Other languages
Chinese (zh)
Inventor
易建军
刘荣
贺亮
相浩杰
张新科
陈李炜
黄昊
Current Assignee
East China University of Science and Technology
Shanghai Aerospace Control Technology Institute
Original Assignee
East China University of Science and Technology
Shanghai Aerospace Control Technology Institute
Priority date: 2019-12-31
Filing date: 2019-12-31
Publication date: 2020-09-18
Application filed by East China University of Science and Technology and Shanghai Aerospace Control Technology Institute
Priority to CN201911421348.7A
Publication of CN111679680A


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/04 Control of altitude or depth
    • G05D1/06 Rate of change of altitude or depth
    • G05D1/0607 Rate of change of altitude or depth specially adapted for aircraft
    • G05D1/0653 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D1/0676 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
    • G05D1/0684 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing on a moving platform, e.g. aircraft carrier

Abstract

The invention discloses an unmanned aerial vehicle autonomous landing method and system. The method comprises the following steps: a mode judging step of judging in real time whether the unmanned aerial vehicle is in a task mode or a target searching mode; a waypoint moving step of, when the unmanned aerial vehicle is in the task mode, controlling it to navigate along a route to a designated landing area and switch to the target searching mode; a target identifying step of, when the unmanned aerial vehicle is in the target searching mode, controlling it to search for the correct carrier landing identifier; a landing guiding step of controlling the unmanned aerial vehicle to gradually approach the carrier landing identifier according to a fusion algorithm; and a landing step of controlling the unmanned aerial vehicle to land on the ship.

Description

Unmanned aerial vehicle autonomous landing method and system
Technical Field
The invention relates to the technical field of unmanned flight, in particular to an unmanned aerial vehicle autonomous landing method and system.
Background
Landing an unmanned aerial vehicle successfully on a ship is a very complex control task: the ship moves continuously and its speed is not fixed, so autonomous landing is extremely difficult. After many years of development and research, the autonomous landing navigation systems currently applied to unmanned aerial vehicles at home and abroad mainly comprise the Global Positioning System (GPS), the Inertial Navigation System (INS), photoelectric navigation, visual navigation, and integrated navigation systems. Existing common autonomous landing systems and methods rely on monocular vision and therefore lack precision during the landing process.
Therefore, it is necessary to develop a method and a system for autonomous landing of an unmanned aerial vehicle to improve the accuracy of the unmanned aerial vehicle in the autonomous landing process.
Disclosure of Invention
The invention provides an unmanned aerial vehicle autonomous landing method and system, which improve the accuracy of autonomous landing through a multi-sensor fusion positioning algorithm.
The invention provides an unmanned aerial vehicle autonomous landing method, which comprises the following steps: a mode judging step of judging in real time whether the unmanned aerial vehicle is in a task mode or a target searching mode; a waypoint moving step of, when the unmanned aerial vehicle is in the task mode, controlling it to navigate along a route to a designated landing area and switch to the target searching mode; a target identifying step of, when the unmanned aerial vehicle is in the target searching mode, controlling it to search for the correct carrier landing identifier; a landing guiding step of controlling the unmanned aerial vehicle to gradually approach the carrier landing identifier according to a fusion algorithm; and a landing step of controlling the unmanned aerial vehicle to land on the ship.
Further, the method also comprises a real-time monitoring step, in which the pose state and the landing state of the unmanned aerial vehicle are monitored in real time.
Further, in the waypoint moving step: and after the unmanned aerial vehicle reaches one waypoint, judging whether the waypoint is the last waypoint or not, and if so, switching the unmanned aerial vehicle to the target searching mode.
Further, the target identifying step comprises the following steps: an image acquisition step, in which a carrier landing image is acquired by a downward-looking camera of the unmanned aerial vehicle; an identification detection step, in which a carrier landing identifier is obtained by the unmanned aerial vehicle through image-processing detection on the carrier landing image; and an identification judgment step of judging whether the carrier landing identifier is the same as a preset correct identifier: if so, the landing guiding step is executed; if not, the image is updated, the image is re-detected for the correct identifier, and the process returns to the image acquisition step. If the correct identifier cannot be detected after a period of time, the unmanned aerial vehicle is controlled to rise so as to obtain an image in which the correct identifier can be detected.
Further, in the step of guiding carrier landing, when the relative position of the unmanned aerial vehicle and the carrier landing identifier is smaller than a set value, the step of carrier landing is executed.
Further, in the step of guiding carrier landing, when the unmanned aerial vehicle loses the carrier landing identifier in the process of gradually approaching the carrier landing identifier, the step of identifying the target is executed.
Further, the landing guiding step comprises the following steps: a data acquisition step of acquiring the raw data of each sensor; a data conversion step of converting the raw data into a relative coordinate system; a data checking step of checking the validity of the converted data; and a final state estimation step of obtaining the next state estimate of each sensor through a sub-Kalman filter and calculating the final state estimate in the main filter according to the data distribution coefficient of each sub-filter.
Further, the landing guiding step also comprises an INS periodic correction step, in which the error value between the final state estimate and the INS output value is used to periodically correct the INS.
Further, in the landing guiding step, after an estimate is obtained through the fusion algorithm, the moving direction of the unmanned aerial vehicle is judged from the sign of the estimate, a micro step length in that direction is obtained through the PID (proportional-integral-derivative) regulation principle, and the unmanned aerial vehicle is moved by that micro step length in that direction until it reaches the landing standard, whereupon the landing step is executed.
The invention also provides an unmanned aerial vehicle autonomous landing system, which adopts any unmanned aerial vehicle autonomous landing method.
The invention has the advantages that the unmanned aerial vehicle autonomous landing method and the unmanned aerial vehicle autonomous landing system are provided, and the accuracy of the unmanned aerial vehicle autonomous landing is improved through the fusion positioning algorithm of a plurality of sensors.
Drawings
The technical solution and other advantages of the present application will become apparent from the detailed description of the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a flowchart of an autonomous landing method of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 2 is a block diagram of an autonomous landing system of an unmanned aerial vehicle in an embodiment of the present invention;
FIG. 3 is a block diagram of an information delivery system in accordance with an embodiment of the present invention;
FIG. 4 is a block diagram of the hardware on board the drone, in accordance with an embodiment of the present invention;
FIG. 5 is a hardware block diagram of a ground monitoring system in accordance with an embodiment of the present invention;
FIG. 6 is a functional block diagram of an airborne system and a ground monitoring system in accordance with an embodiment of the present invention;
FIG. 7 is an overall framework diagram of a multi-sensor fusion algorithm in accordance with an embodiment of the present invention;
FIG. 8 is a flow chart illustrating the application of the x-direction position controller and the PID controller according to an embodiment of the invention.
Detailed Description
The invention is further illustrated by the following figures and examples. It should be understood that the specific implementation examples described herein are only for the purpose of illustrating the invention and are not to be construed as limiting the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Fig. 1 is a flow chart of an autonomous landing method of an unmanned aerial vehicle.
As shown in fig. 1, in an embodiment of the present invention, an autonomous landing method for an unmanned aerial vehicle mainly includes the following steps: the method comprises the steps of mode judgment, waypoint movement, target identification, carrier landing guidance and carrier landing.
Specifically, when the unmanned aerial vehicle is about to land autonomously on a ship, the mode judging step is entered first. As shown in fig. 6, a mode switcher in the airborne system of the unmanned aerial vehicle switches its working mode, achieving a seamless hand-over between modes during the autonomous landing process. The initial mode of the unmanned aerial vehicle is the task mode. In the task mode, the waypoint moving step is entered and the unmanned aerial vehicle is controlled to navigate to the designated landing area along a route formed by connecting specified waypoints, the last waypoint being the current position of the landing identifier in the landing area. After the unmanned aerial vehicle reaches a waypoint, it judges whether that waypoint is the last one: if so, it switches to the target searching mode; if not, it remains in the task mode and continues along the route.
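As an illustration only, the following Python sketch shows this mode hand-over and waypoint loop; the Mode names and the UAV methods (fly_to and the mode attribute) are hypothetical, not part of the patent:

```python
from enum import Enum, auto

class Mode(Enum):
    TASK = auto()           # task mode: fly the waypoint route
    TARGET_SEARCH = auto()  # search for the carrier landing identifier

def waypoint_moving_step(uav, waypoints):
    """Fly the route; switch modes at the last waypoint (hypothetical API)."""
    uav.mode = Mode.TASK                    # the initial mode is the task mode
    for i, wp in enumerate(waypoints):
        uav.fly_to(wp)                      # navigate to the next waypoint
        if i == len(waypoints) - 1:         # last waypoint = landing identifier position
            uav.mode = Mode.TARGET_SEARCH   # seamless hand-over to target search
```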
When the unmanned aerial vehicle sails to the last waypoint and switches to the target searching mode, the target identifying step is entered and the unmanned aerial vehicle is controlled to search for the correct carrier landing identifier, which it recognizes from images captured by the onboard camera. If the correct identifier is recognized, the unmanned aerial vehicle enters the landing guiding step. If not, the current ship GPS data transmitted by the ground monitoring system is read through the data transceiver of the airborne system, as shown in fig. 6, to update the correct landing identifier; the unmanned aerial vehicle readjusts toward the updated identifier and continues the identification until the correct identifier is found. During identification detection, if the correct identifier cannot be detected after a period of time, the unmanned aerial vehicle is controlled to rise so as to obtain an image in which the correct identifier can be detected.
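A minimal sketch of this search loop follows, with a detect_marker stub standing in for the patent's image-processing detection (all names here are hypothetical assumptions):

```python
import time

def detect_marker(frame):
    """Placeholder for the patent's image-processing detection;
    returns the detected identifier, or None if nothing is found."""
    return None

def target_identifying_step(uav, correct_id, timeout=30.0):
    start = time.time()
    while True:
        frame = uav.camera.capture()              # down-looking camera image
        found = detect_marker(frame)
        if found == correct_id:
            return True                           # proceed to guided landing
        if found is None and time.time() - start > timeout:
            uav.climb(5.0)                        # rise to widen the field of view
            start = time.time()
        elif found is not None:                   # wrong marker detected
            correct_id = uav.update_id_from_ship_gps()  # ground-station update
```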
After entering the landing guiding step, the unmanned aerial vehicle is controlled to gradually approach the carrier landing identifier according to the fusion algorithm. If the identifier is never lost during the approach, the landing step is entered once the relative position between the unmanned aerial vehicle and the identifier becomes smaller than a set value; if the identifier is lost during the approach, the target identifying step is re-entered. The fusion algorithm itself is explained in detail below.
After entering the landing step, the unmanned aerial vehicle is controlled to land on the ship, and the whole autonomous landing process is thus finished.
In addition, to ensure safety throughout the autonomous landing process, the method further comprises a real-time monitoring step: while all the above steps are in progress, the pose state and landing state of the unmanned aerial vehicle are monitored in real time. When it is judged that manual control is needed, the unmanned aerial vehicle is switched to manual control, and a worker can then input commands and data through a keyboard to control it manually.
The fusion algorithm of the present invention will be described in detail below, and fig. 7 is an overall frame diagram of the multi-sensor fusion algorithm.
The fusion algorithm of the invention comprises the following steps: data acquisition, data conversion, data checking, final state estimation, and INS periodic correction. In the data acquisition step, the raw data of each sensor is acquired. In the data conversion step, the raw data is converted into a relative coordinate system. In the data checking step, the validity of the converted data is checked. In the final state estimation step, the next state estimate of each sensor is obtained through a sub-Kalman filter, and the final state estimate is calculated in the main filter according to the data distribution coefficient of each sub-filter. In the INS periodic correction step, the error between the final state estimate and the INS output is used to periodically correct the INS. These steps are described in detail with the following equations.
Since the validity of each sensor's data changes while the unmanned aerial vehicle approaches the landing identifier, the reliability of the result must be guaranteed. Therefore, after the coordinate-converted data of each sensor is obtained through the data acquisition and data conversion steps, a chi-square test based on the Mahalanobis distance between the predicted value given by the INS and the observed value of each sensor is used to judge whether the data meets the fusion requirement, as shown in fig. 7. The predicted value is first obtained from the following formula.
Equation 1: zr(k)=Hi(k)Xr(k)
Wherein Xr(k) Is the predicted value of the state at the time k calculated by the calibrated INS, and then the covariance matrix of the predicted value and the measured value of each sensor is calculated
Figure BDA0002352464240000051
And the squared Mahalanobis distance gamma between themi(k):
Equation 2:
Figure BDA0002352464240000052
equation 3:
Figure BDA0002352464240000053
equation 4:
Figure BDA0002352464240000054
wherein
Figure BDA0002352464240000055
Is a covariance matrix, P, of predicted values and true values at the ith sensor time ki(k-1) is a covariance matrix of the estimated value and the true value at the moment of the ith sensor (k-1), MkMahalanobis distance, which follows a chi-square distribution with a degree of freedom of 4 or 1, and a degree of freedomIs determined by the number of independent data observed by the sensor. Obviously, γi(k) The smaller the value, the closer the data collected by the sensor is to the data calculated by the reference system, i.e., the more trustworthy the data is, the significance level α is selectediIt is a small value, and the correspondent chi-square value can be obtained by looking up chi-square distribution table
Figure BDA0002352464240000056
Then the conditions for determining whether data is available for fusion are as follows:
equation 5:
Figure BDA0002352464240000057
wherein Pr [. C]Is a probability representing the occurrence of an event, γi(k) If the probability greater than the critical value is less than the significant level, the sensor information is considered to be available for fusion, otherwise, prediction of the next moment of the corresponding sub-filter is directly entered.
Whether or not its information can take part in the final fusion, each sensor's data must still pass through the time update and state update of its sub-filter, so that the data can be tested again at the next epoch. The sub-filter Kalman gain matrix is calculated from equations 2 and 3:

Equation 6: K_i(k) = P̄_i(k) H_i^T(k) S_i^{-1}(k)

State update and covariance update:

Equation 7: X_i(k) = X_r(k) + K_i(k) [Z_i(k) - Z_r(k)]

Equation 8: P_i(k) = [I - K_i(k) H_i(k)] P̄_i(k)
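A corresponding sketch of the sub-filter update (equations 6-8), again under assumed numpy conventions:

```python
import numpy as np

def subfilter_update(x_r, P_prior, z_i, H_i, R_i):
    """Sub-Kalman-filter gain and update steps (equations 6-8)."""
    S_i = H_i @ P_prior @ H_i.T + R_i               # innovation covariance (equation 3)
    K_i = P_prior @ H_i.T @ np.linalg.inv(S_i)      # equation 6: Kalman gain
    x_i = x_r + K_i @ (z_i - H_i @ x_r)             # equation 7: state update
    P_i = (np.eye(len(x_r)) - K_i @ H_i) @ P_prior  # equation 8: covariance update
    return x_i, P_i
```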
In order to make the fusion algorithm adaptive and bring the final output state close to the accurately measured sensor data, the invention adopts a federated Kalman information distribution coefficient calculation method based on eigenvalues. The method adaptively adjusts the information distribution proportion and thereby improves the global filtering precision, as follows.
The eigenvalues of P_i(k) are solved and recorded as λ_i^m(k), the m-th eigenvalue of the i-th sub-filter, and the sum of squared eigenvalues is calculated:

Equation 9: T_i(k) = Σ_{m=1}^{n_m} [λ_i^m(k)]²

where n_m denotes the number of eigenvalues. The information distribution coefficient of each sub-filter is then solved:

Equation 10: β_i(k) = [1 / T_i(k)] / Σ_{j=1}^{n_i} [1 / T_j(k)]

where n_i is the number of sensors and β_i(k) is the information distribution coefficient of the i-th sub-filter; a sub-filter whose covariance eigenvalues are small, i.e., whose estimate is precise, thus receives a larger share of the distribution. Finally, the fused state is solved:

Equation 11: X_f(k) = Σ_{i=1}^{n_i} β_i(k) X_i(k)
because the INS can obtain a relatively accurate system state value in a short time, but due to accumulation and drift of measurement errors, the reliability of data of the INS is rapidly attenuated after long-time work, so that in order to ensure that reference system data has real and reliable reference, the INS needs to be corrected by using the fused estimated value, and the correction amount is as follows:
equation 12: x (k) ═ xf(k)-Xr(k)
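Equations 9-12 can be sketched as follows; the inverse-eigenvalue weighting mirrors the reading above and should be taken as an assumption:

```python
import numpy as np

def fuse_and_correct(states, covariances, x_r):
    """Eigenvalue-based information distribution and INS correction (equations 9-12)."""
    # Equation 9: sum of squared eigenvalues T_i(k) of each sub-filter covariance
    T = np.array([np.sum(np.linalg.eigvalsh(P) ** 2) for P in covariances])
    # Equation 10: smaller eigenvalue energy -> more trusted -> larger coefficient
    w = 1.0 / T
    beta = w / w.sum()
    # Equation 11: fused state
    x_f = sum(b * x for b, x in zip(beta, states))
    # Equation 12: correction amount fed back to the INS
    delta_x = x_f - x_r
    return x_f, delta_x
```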
FIG. 8 is a flow chart of the application of the x-direction position controller and the PID controller. After the fusion algorithm outputs the estimate in the x direction, the unmanned aerial vehicle reads this estimate and judges its moving direction from the sign of the value. According to the PID control principle, the estimate can be regarded as the deviation between the controlled object and the target; this deviation e_x(t) is taken as the input of the following equation:

Equation 13: u(t) = K_p [e(t) + (1/T_i) ∫ e(t) dt + T_d · de(t)/dt]

where K_p is the proportional coefficient, T_i is the integration time constant, and T_d is the differential time constant. Equation 13 yields the corresponding output u_x(t), which is the micro step Δx in the x direction. Δx is added to the control quantity, and the unmanned aerial vehicle is moved by Δx in the positive or negative x direction through MAVROS. The estimate is then read again to judge whether the landing standard is met; if not, the new deviation is substituted into the PID regulator, and the procedure repeats until the standard is met. The unmanned aerial vehicle then descends by a fixed step Δz while it is continuously judged whether the x offset has drifted outside the standard; if so, the x estimate is read again and PID regulation continues until x is back within the standard range, after which the descent continues until the unmanned aerial vehicle finishes landing.
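A sketch of equation 13 and the micro-step loop; the UAV methods read_fused_x and move_x are hypothetical stand-ins for the MAVROS interface described above:

```python
class PID:
    """Equation 13: u(t) = Kp * (e(t) + (1/Ti) * int(e) dt + Td * de/dt)."""
    def __init__(self, kp, ti, td, dt):
        self.kp, self.ti, self.td, self.dt = kp, ti, td, dt
        self.integral = 0.0
        self.prev_e = 0.0

    def step(self, e):
        self.integral += e * self.dt
        derivative = (e - self.prev_e) / self.dt
        self.prev_e = e
        return self.kp * (e + self.integral / self.ti + self.td * derivative)

def approach_in_x(uav, pid, tolerance=0.05):
    """Drive the x offset below the landing standard with micro steps."""
    e_x = uav.read_fused_x()          # fused x estimate = deviation from the target
    while abs(e_x) > tolerance:
        dx = pid.step(e_x)            # micro step from equation 13
        uav.move_x(dx)                # issued through MAVROS in the patent
        e_x = uav.read_fused_x()      # re-check against the landing standard
```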
To carry out the above autonomous landing method, an embodiment of the present invention also provides an unmanned aerial vehicle autonomous landing system. Fig. 2 is a block diagram of the system. As shown in fig. 2, the system mainly comprises six modules: a central processing module, a visual positioning module, a wireless positioning and transmission module, a GPS/INS module, a flight control module, and a ground monitoring module, each described in detail below.
The central processing module is the core of the system's information processing and comprises an image microprocessor, an external controller, an information fusion device, and a main controller. The main controller coordinates the information exchange, working frequency, and start-up timing of the other three processors. The image microprocessor acquires the position information of the landing identifier and transmits it to the information fusion device, where it is fused with the data of the other sensors to estimate the position of the unmanned aerial vehicle relative to the ship more accurately than single-vision positioning can. The external controller uses this position estimate to send corresponding instructions to the flight control module, realizing the target function of the whole autonomous landing system.
The visual positioning module acquires and processes images and transmits them to the central processing module. It comprises an image microprocessor, a high-resolution monocular camera, and a high-resolution image transmission unit. The image microprocessor comprises an image preprocessor, a target separator, a position calculator, and a template matcher. It applies image preprocessing and target separation to the images acquired by the onboard high-resolution monocular camera to obtain a target image that is easier to process, the target being the landing identifier. From this target image, the position calculator obtains the actual position coordinates of the target and the template matcher confirms its correctness, after which the position data is packaged and uploaded to the main controller. Meanwhile, the original color image from the camera and the processed, recognized image are transmitted through the high-resolution image transmission unit to the ground monitoring system for real-time monitoring.
The wireless positioning and transmission module acquires the position coordinates of the unmanned aerial vehicle relative to the ship landing identifier, supports the fusion algorithm, and monitors the pose state of the unmanned aerial vehicle. It comprises wireless ranging modules, a wireless positioning processor, and a wireless transmission module. The wireless ranging modules are installed both on the unmanned aerial vehicle and at the ship ground station: the unmanned aerial vehicle carries a tag node, and the ground station carries base-station nodes. Each node can measure the distance between itself and the base-station or tag nodes, and the tag node fuses the distances to each base-station node through the wireless positioning processor with a positioning algorithm, obtaining the position coordinates of the unmanned aerial vehicle relative to the ship landing target, which are uploaded to the information fusion device to be fused with the other position data. Meanwhile, this position data, together with related data from the other subsystems, is uploaded through the wireless transmission module to the ground monitoring system for real-time monitoring, so that the pose state of the unmanned aerial vehicle is observed at all times.
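The patent does not spell the positioning algorithm out; a common choice is linearized least-squares multilateration over the tag-to-base-station ranges, sketched here as an assumption:

```python
import numpy as np

def multilaterate(anchors, ranges):
    """Least-squares tag position from ranges to >= 4 base-station nodes.

    anchors: (n, 3) base-station coordinates; ranges: (n,) measured distances.
    Linearized by subtracting the first range equation from the others."""
    a0, d0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (d0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos  # tag (UAV) coordinates in the base-station frame
```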
The GPS/INS module serves as the reference system of the whole system. Its data can be used as part of the fused data to improve the precision and reliability of the fused estimate, and at the same time the GPS and INS data it provides serve the start-up, unlocking, and attitude adjustment of the flight control module.
The flight control module is the actuator of the system and comprises an external controller, a flight controller, and a servo subsystem, the flight controller being the core. To reduce design complexity, this embodiment uses a commercially mature flight controller and controls it through its off-board control sub-mode, which uses a dedicated communication protocol. Specifically, the external controller in the central processing module converts the unmanned aerial vehicle position estimated by the fusion device into command information controlling the attitude angles of the unmanned aerial vehicle; after receiving a standard control command, the flight controller drives the servo subsystem to control the motors, so that the unmanned aerial vehicle adjusts its attitude and approaches the ship landing identifier.
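For a PX4-style autopilot, this off-board hand-over looks roughly like the following MAVROS sketch (the topic and service names are MAVROS conventions, not taken from the patent):

```python
import rospy
from geometry_msgs.msg import PoseStamped
from mavros_msgs.srv import SetMode

rospy.init_node("offboard_guidance")
pub = rospy.Publisher("/mavros/setpoint_position/local", PoseStamped, queue_size=10)
rospy.wait_for_service("/mavros/set_mode")
set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)

setpoint = PoseStamped()
setpoint.pose.position.z = 2.0         # hover target above the deck (illustrative)

rate = rospy.Rate(20)
for _ in range(40):                    # setpoints must stream before OFFBOARD engages
    pub.publish(setpoint)
    rate.sleep()
set_mode(custom_mode="OFFBOARD")       # hand control to the external controller
while not rospy.is_shutdown():
    pub.publish(setpoint)              # keep streaming position commands
    rate.sleep()
```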
The ground monitoring module is an effective instrument for the staff to know the running state of the unmanned aerial vehicle in real time and is a further guarantee of a safe landing. According to the project requirements, the module comprises an upper-computer display system and wireless ranging modules, the latter being the base-station nodes of the wireless ranging modules described under the wireless positioning and transmission module. The upper-computer display system mainly displays the data delivered by the wireless transmission module: state information such as the original-versus-binarized image comparison video and the speed and pose of the unmanned aerial vehicle, as well as the sensor data and fused data involved in the data fusion process. The system can also provide auxiliary functions such as parameter setting, historical data storage, and manual/automatic control switching as required.
As described above, there are many data and information transmissions in the whole unmanned aerial vehicle autonomous landing system. The information transmission of the whole system is described in detail below; fig. 3 is a block diagram of the information transmission system.
As shown in fig. 3, in an embodiment of the present invention, the information transmission system mainly comprises the following functional layers: a sensing layer, a data and transmission layer, an application and storage layer, and an interaction layer, each described in detail below.
The sensing layer is the basis of the whole information transmission system and comprises a monocular camera, an IMU, a GPS, UWB, and a keyboard. As the unmanned aerial vehicle gradually approaches the landing identifier, the downward-mounted monocular camera acquires raw images containing the landing identifier in real time. The IMU, as a built-in sensor, outputs the linear and angular accelerations of the unmanned aerial vehicle in three mutually perpendicular directions. The GPS receives the longitude, latitude, and height of the unmanned aerial vehicle and the ship in the current earth coordinate system as sent by the satellites. UWB measures the distance between the tag node on the unmanned aerial vehicle and the base-station nodes on the ship. To ensure system safety, the staff can input command data through the keyboard for manual control of the unmanned aerial vehicle.
In the data and transmission layer, some raw data collected by the sensing layer, such as image data and the distance data collected by UWB, can only be used after being transmitted to other modules for processing, while some data, such as linear/angular acceleration, longitude and latitude, and height, can be used directly or indirectly without processing by a dedicated module. Some data is manually issued command data, which has a low rate and a small volume.
After being transmitted to the designated modules, the data is used or stored by the application and storage layer. The raw image data of the system undergoes image recognition and processing to obtain position data that participates in the algorithm fusion, and the processed images are stored together with the originals in an image database in the ground station system. The distance data of the UWB modules is processed by the positioning algorithm into coordinate data, which is fed together with the GPS data, the IMU data, and the image-derived position data into the multi-sensor fusion algorithm to obtain the final position estimate. This estimate is packaged according to the MAVLink protocol and transmitted through a serial port to the flight control module, which controls the unmanned aerial vehicle in combination with the system parameters entered by keyboard into the parameter library.
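Packaging a fused position estimate as a MAVLink message over a serial port could look as follows with pymavlink; the port name, baud rate, and message choice are assumptions, not details from the patent:

```python
from pymavlink import mavutil

# Serial link to the flight controller; device and baud rate are assumptions
master = mavutil.mavlink_connection("/dev/ttyUSB0", baud=921600)
master.wait_heartbeat()                        # wait for the autopilot

def send_fused_position(x, y, z):
    """Pack the fused estimate as a local-NED position setpoint."""
    master.mav.set_position_target_local_ned_send(
        0,                                     # time_boot_ms
        master.target_system, master.target_component,
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        0b0000111111111000,                    # type_mask: use position fields only
        x, y, z,                               # position (m, NED)
        0, 0, 0,                               # velocity (ignored)
        0, 0, 0,                               # acceleration (ignored)
        0, 0)                                  # yaw, yaw_rate (ignored)
```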
The interaction layer is an interaction interface developed with Qt Creator that displays the various collected information, including images, the pose of the unmanned aerial vehicle, system parameters, and fault alarm data. For landing safety, a manual-operation interface is also provided: when the unmanned aerial vehicle does not fly along the expected route, a worker can stop the automatic landing program from the interface and enter manual control mode.
As can be seen from the above description, in the embodiment of the present invention some modules of the autonomous landing system are carried on the unmanned aerial vehicle while others belong to the ground monitoring system; the airborne hardware and the ground monitoring hardware are described in detail below.
Fig. 4 is a block diagram of the onboard hardware of the unmanned aerial vehicle. As shown in fig. 4, the unmanned aerial vehicle comprises a central processing module, a data acquisition module, a wireless transmission module, a flight control module, and a power module.
More specifically, the central processing module is mainly the central processing unit, for which this embodiment prefers the TX2. The data acquisition module mainly comprises a LoRa module, a UWB positioning module, a GPS, and the flight control module, for which Pixhawk is preferred. The LoRa module and the UWB positioning module communicate with the central processing unit through a UART serial protocol, while the GPS and the flight control module communicate with it through the MAVLink protocol over UART. The wireless transmission module comprises a camera and an image transmission transmitter: the visible-light camera, preferably a Sony FCB-EV7500/7520, communicates with the central processing unit through an LVDS interface, and the image transmission transmitter through Mini HDMI. The power module comprises the motors and electronic speed controllers and, through the power distribution board, provides different voltages such as 5 V/12 V/24 V for the other modules.
Fig. 5 is the hardware block diagram of the ground monitoring module. As shown in fig. 5, in this embodiment the module uses an STM32F103RCT6 as the main chip, which receives through serial ports the ranging data transmitted by four UWB base stations. During normal operation of the unmanned aerial vehicle, the LoRa module continuously receives the pose data, UWB positioning data, visual positioning data, fused position estimates, and other data transmitted by the airborne system; the main chip packs this data and sends it to the industrial control host according to the TCP/IP protocol, where it is displayed by the user interface. Commands and configuration parameters from the monitoring platform host are delivered to the main chip as TCP/IP packets for parsing and are sent to the unmanned aerial vehicle through LoRa. The system needs no redundant power supply configuration; the 5 V it uses can be provided by the industrial control host.
Fig. 6 is a functional block diagram of the airborne system and the ground monitoring module. As shown in fig. 6, they comprise: the mode switcher, which switches the working modes of the unmanned aerial vehicle and realizes seamless hand-over between modes during autonomous landing; the position and direction controller, which controls the throttle and the pitch, roll, and yaw motions to change the position and speed of the unmanned aerial vehicle; the PID controller, which continuously adjusts the moving speed according to the current position, realizing accurate control of the landing point and improving the robustness of the system; the data transceiver, which coordinates the information exchange between the two systems, including data receiving, checking, parsing, packaging, and sending, ensuring safe and reliable data transmission; and the fault processor, which provides safety guarantees in case of sudden or abnormal system conditions.
In conclusion, by using the unmanned aerial vehicle autonomous landing method and system provided by the invention, the accuracy of the unmanned aerial vehicle autonomous landing is improved through the fusion positioning algorithm of a plurality of sensors.
The present invention is not limited to the above preferred embodiments, and any modifications, equivalent substitutions and improvements made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An unmanned aerial vehicle autonomous landing method is characterized by comprising the following steps:
a mode judging step, namely judging whether the unmanned aerial vehicle is in a task mode or a target searching mode in real time;
a waypoint moving step, namely controlling the unmanned aerial vehicle to navigate to a designated landing area according to a route and switching to a target searching mode when the unmanned aerial vehicle is in the task mode;
identifying a target, namely controlling the unmanned aerial vehicle to search a correct carrier landing identifier when the unmanned aerial vehicle is in the target searching mode;
guiding carrier landing, and controlling the unmanned aerial vehicle to gradually approach the carrier landing identifier according to a fusion algorithm; and
and a landing step, controlling the unmanned aerial vehicle to land on a ship.
2. The autonomous landing method for unmanned aerial vehicles according to claim 1, further comprising a real-time monitoring step of monitoring the pose state and landing state of the unmanned aerial vehicle in real time.
3. The autonomous landing method for unmanned aerial vehicles according to claim 1, wherein in the waypoint moving step:
when the unmanned aerial vehicle reaches a waypoint, judging whether the waypoint is the last waypoint or not,
and if so, switching the unmanned aerial vehicle to the target searching mode.
4. The autonomous landing method of unmanned aerial vehicles according to claim 1,
the step of identifying the target comprises the steps of:
the method comprises the steps of image acquisition, wherein a carrier landing image is acquired and acquired by a downward-looking camera of the unmanned aerial vehicle;
the identification detection step is to obtain a carrier landing identification, wherein the carrier landing identification is obtained by the unmanned aerial vehicle performing image processing detection on the carrier landing image; and
an identification judgment step, namely judging whether the landing identification is the same as a preset correct identification or not, and if so, executing the step of guiding landing; if not, updating the image, re-detecting whether the correct landing identification exists in the image, and returning to the image acquisition step;
if the correct landing identification cannot be detected after a period of time, controlling the unmanned aerial vehicle to rise to obtain an image capable of detecting the correct landing identification.
5. The autonomous landing method for unmanned aerial vehicles according to claim 1, wherein in the landing guiding step, when the relative position of the unmanned aerial vehicle and the landing indicator is less than a set value, the landing step is performed.
6. The autonomous landing method for unmanned aerial vehicles according to claim 1, wherein in the landing guiding step, the target identifying step is performed when the unmanned aerial vehicle loses the landing indicator in the process of gradually approaching the landing indicator.
7. The autonomous landing method of unmanned aerial vehicles according to claim 1,
the step of guiding carrier landing comprises the following steps:
a data acquisition step, in which the raw data of each sensor is acquired;
a data conversion step of converting the original data into a relative coordinate system;
a data checking step of checking validity of the converted data;
and a step of obtaining a final state estimation value, namely obtaining a next state estimation value of the sensor through a sub-Kalman filter and calculating the final state estimation value in the main filter according to the data distribution coefficient of each sub-filter.
8. The autonomous landing method for unmanned aerial vehicles according to claim 7, further comprising an INS periodic correction step in the landing guiding step, wherein the INS is periodically corrected by using an error value between the final state estimation value and an INS output value.
9. The autonomous landing method for unmanned aerial vehicles according to claim 7, wherein in the landing guiding step, after the estimated value is obtained through the fusion algorithm, the moving direction of the unmanned aerial vehicle is judged from the sign of the estimated value, a micro step length of the unmanned aerial vehicle in one direction is then obtained through the PID regulation and control principle, and the unmanned aerial vehicle is moved in that direction by the micro step length until it reaches the landing standard, whereupon the landing step is executed.
10. An unmanned aerial vehicle autonomous landing system, which adopts the unmanned aerial vehicle autonomous landing method as claimed in any one of claims 1 to 9.
Application CN201911421348.7A (priority date 2019-12-31, filing date 2019-12-31): Unmanned aerial vehicle autonomous landing method and system. Status: Pending. Published as CN111679680A.

Priority Applications (1)

Application Number: CN201911421348.7A
Priority Date: 2019-12-31
Filing Date: 2019-12-31
Title: Unmanned aerial vehicle autonomous landing method and system

Publications (1)

Publication Number: CN111679680A
Publication Date: 2020-09-18

Family ID: 72451204

Family Applications (1)

Application Number: CN201911421348.7A
Title: Unmanned aerial vehicle autonomous landing method and system
Priority Date: 2019-12-31
Filing Date: 2019-12-31

Country: CN



Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102353378A (en) * 2011-09-09 2012-02-15 南京航空航天大学 Adaptive federal filtering method of vector-form information distribution coefficients
CN103700286A (en) * 2013-12-11 2014-04-02 南京航空航天大学 Automatic carrier-landing guiding method of carrier-borne unmanned aircraft
CN104590576A (en) * 2014-12-04 2015-05-06 南京航空航天大学 Flight control system and method for ship-borne unmanned aerial vehicle autonomous landing
CN104932522A (en) * 2015-05-27 2015-09-23 深圳市大疆创新科技有限公司 Autonomous landing method and system for aircraft
CN105021184A (en) * 2015-07-08 2015-11-04 西安电子科技大学 Pose estimation system and method for visual carrier landing navigation on mobile platform
CN110546459A (en) * 2017-02-08 2019-12-06 马凯特大学 Robot tracking navigation with data fusion
CN109270953A (en) * 2018-10-10 2019-01-25 大连理工大学 A kind of multi-rotor unmanned aerial vehicle Autonomous landing method based on concentric circles visual cues
CN109911231A (en) * 2019-03-20 2019-06-21 武汉理工大学 Unmanned plane autonomous landing on the ship method and system based on GPS and image recognition hybrid navigation
CN110032201A (en) * 2019-04-19 2019-07-19 成都飞机工业(集团)有限责任公司 A method of the airborne visual gesture fusion of IMU based on Kalman filtering
CN110081881A (en) * 2019-04-19 2019-08-02 成都飞机工业(集团)有限责任公司 It is a kind of based on unmanned plane multi-sensor information fusion technology warship bootstrap technique

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
程隽逸 (Cheng Junyi): "Research on automatic carrier landing technology for shipborne unmanned aerial vehicles based on multi-sensor fusion", China Master's Theses Full-text Database, Engineering Science and Technology II *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112269399A (en) * 2020-11-06 2021-01-26 北京理工大学 Active recovery control method and device applied to unmanned aerial vehicle
CN114115345A (en) * 2021-11-19 2022-03-01 中国直升机设计研究所 Visual landing guiding method and system for rotor unmanned aerial vehicle
CN114384929A (en) * 2021-12-02 2022-04-22 上海航天控制技术研究所 Unmanned cluster formation control method based on deviation optimization heuristic algorithm
CN114384929B (en) * 2021-12-02 2023-09-12 上海航天控制技术研究所 Unmanned cluster formation control method based on deviation optimization heuristic algorithm
CN114442663A (en) * 2022-01-12 2022-05-06 苏州大学 Multi-working-mode automatic cruise unmanned aerial vehicle system
CN116719227A (en) * 2023-08-11 2023-09-08 北京瞭望神州科技有限公司 Protection method and protection system applied to intelligent security scene of cultivated land
CN116719227B (en) * 2023-08-11 2023-10-24 北京瞭望神州科技有限公司 Protection method and protection system applied to intelligent security scene of cultivated land


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200918)