CN110687928A - Landing control method, system, unmanned aerial vehicle and storage medium - Google Patents

Landing control method, system, unmanned aerial vehicle and storage medium

Info

Publication number
CN110687928A
CN110687928A (application CN201910849942.XA)
Authority
CN
China
Prior art keywords
mobile platform
unmanned aerial
aerial vehicle
target mobile
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910849942.XA
Other languages
Chinese (zh)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Zhendi Intelligent Technology Co Ltd
Original Assignee
Suzhou Zhendi Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Zhendi Intelligent Technology Co Ltd
Priority to CN201910849942.XA
Publication of CN110687928A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/12 Target-seeking control
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Abstract

The application provides a landing control method, a system, an unmanned aerial vehicle and a storage medium, wherein the method is applied to the unmanned aerial vehicle and comprises the following steps: acquiring state parameters of the unmanned aerial vehicle; acquiring state parameters of a target mobile platform, wherein a preset landing point label is arranged on the surface of the target mobile platform; planning a flight path of the unmanned aerial vehicle from the position where the unmanned aerial vehicle is located to the preset landing point label according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform; and controlling the unmanned aerial vehicle to travel along the flight path so as to land on the preset landing point label of the target mobile platform. The landing control method enables the unmanned aerial vehicle to land on the target mobile platform, and therefore the application range of the unmanned aerial vehicle can be widened.

Description

Landing control method, system, unmanned aerial vehicle and storage medium
Technical Field
The application relates to the field of unmanned aerial vehicles, in particular to a landing control method, a landing control system, an unmanned aerial vehicle and a nonvolatile readable storage medium.
Background
With the development of society, unmanned aerial vehicle technology has been advancing rapidly, and unmanned aerial vehicles are becoming increasingly popular for both commercial and civilian use. However, existing unmanned aerial vehicles still have many shortcomings; for example, they can only land on static ground or on a stationary platform, which narrows their range of application and makes it difficult to meet people's growing demands on unmanned aerial vehicles.
Disclosure of Invention
In view of the above, an object of the embodiments of the present application is to provide a landing control method, a landing control system, a drone and a non-volatile storage medium, which enable the drone to land on a mobile platform.
A landing control method is applied to an unmanned aerial vehicle and comprises the following steps: acquiring state parameters of the unmanned aerial vehicle; acquiring state parameters of a target mobile platform, wherein a preset landing point label is arranged on the surface of the target mobile platform; planning a flight path of the unmanned aerial vehicle from the position where the unmanned aerial vehicle is located to the preset landing point label according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform; and controlling the unmanned aerial vehicle to travel along the flight path so as to land on the preset landing point label of the target mobile platform.
In the present application, the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform are acquired, a flight path for landing on the preset landing point label of the target mobile platform is planned according to these parameters, and the unmanned aerial vehicle is then controlled to fly and land, so that the unmanned aerial vehicle can land on the target mobile platform, thereby widening its range of application.
Optionally, before the obtaining of the state parameters of the target mobile platform, the method further includes: detecting the target mobile platform; and after the target mobile platform is detected, determining the position of the preset landing point label.
In the present application, the target mobile platform is detected, and the position of the preset landing point label is determined after the target mobile platform is detected, which facilitates planning the flight path of the unmanned aerial vehicle during the landing process.
Optionally, the preset landing point label includes patterns of different shapes, and the determining the position of the preset landing point label includes: acquiring a surface image of the target mobile platform; and identifying the preset landing point label from the surface image so as to determine the position of the preset landing point label.
When the preset landing point label includes patterns of different shapes, the position of the preset landing point label can be accurately determined by applying image recognition technology.
Optionally, the preset landing point tag includes an ultrasonic transmitter, and the determining the position of the preset landing point tag includes: receiving an ultrasonic signal transmitted by the ultrasonic transmitter; and determining the position of the preset landing point tag according to the ultrasonic signal.
In the present application, when the preset landing point tag includes an ultrasonic transmitter, the position of the preset landing point tag can be determined more accurately by tracking the ultrasonic signal.
Optionally, before the obtaining of the state parameter of the target mobile platform, the method further includes: detecting a mobile platform; and determining the mobile platform with the preset landing point label on the surface as the target mobile platform.
By detecting mobile platforms and determining the mobile platform with the preset landing point label on its surface as the target mobile platform, the target mobile platform can be found even when a plurality of mobile platforms exist.
Optionally, the preset landing point tag includes patterns of different shapes, and the determining, as the target mobile platform, a mobile platform on which the preset landing point tag is disposed includes: acquiring an image of the landing point tag of a mobile platform with a landing point tag on its surface; and when the image of the landing point tag is determined to match the pattern of the preset landing point tag, determining the mobile platform provided with the landing point tag as the target mobile platform.
When the preset landing point tag includes patterns of different shapes, the landing point tag is captured through image recognition and matched against the preset landing point tag, so that the target mobile platform can be accurately determined from a plurality of mobile platforms.
Optionally, the preset landing point tag includes an ultrasonic transmitter, and the determining, as the target moving platform, a moving platform on which the preset landing point tag is disposed includes: receiving an ultrasonic signal; and tracking the ultrasonic transmitter according to the ultrasonic signal, and determining the mobile platform where the ultrasonic transmitter is located as the target mobile platform.
In the present application, when the preset landing point tag includes an ultrasonic transmitter, the target mobile platform can be determined more accurately from a plurality of mobile platforms by tracking the ultrasonic transmitter.
Optionally, the obtaining the state parameter of the target mobile platform includes: calculating the motion parameters of the preset landing point labels relative to the unmanned aerial vehicle; and determining the motion parameters of the target mobile platform according to the motion parameters of the preset landing point tag relative to the unmanned aerial vehicle.
Optionally, the planning, according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform, a flight path along which the unmanned aerial vehicle lands from the position where it is located onto the preset landing point tag includes: calculating relative state parameters of the unmanned aerial vehicle and the target mobile platform according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform; and planning the flight path along which the unmanned aerial vehicle lands onto the preset landing point tag according to the relative state parameters.
Optionally, the controlling the drone to travel along the flight path to land on the preset landing point tag of the target mobile platform includes: adjusting the motion parameters of the unmanned aerial vehicle in real time in the process of controlling the unmanned aerial vehicle to travel along the flight path; and when the unmanned aerial vehicle has travelled to within a preset height of the preset landing point tag and the current state parameters of the unmanned aerial vehicle match the current state parameters of the target mobile platform, controlling the unmanned aerial vehicle to land on the preset landing point tag of the target mobile platform.
A landing control system applied to an unmanned aerial vehicle comprises: the detection module is used for acquiring the state parameters of the unmanned aerial vehicle and the state parameters of a target mobile platform, and a preset landing point label is arranged on the surface of the target mobile platform; the processing module is used for planning a flight path of the unmanned aerial vehicle from the position where the unmanned aerial vehicle is located to the preset landing point label according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform; and the control module is used for controlling the unmanned aerial vehicle to travel along the flight path so as to land on the preset landing point label of the target mobile platform.
Optionally, the detection module is further configured to detect the target moving platform, and determine the position of the preset landing point tag after the target moving platform is detected.
Optionally, the preset landing point tag includes patterns of different shapes, and the detection module is configured to acquire a surface image of the target mobile platform, and identify the preset landing point tag from the surface image to determine the position of the preset landing point tag.
Optionally, the preset landing point tag includes an ultrasonic transmitter, and the detection module is configured to receive an ultrasonic signal transmitted by the ultrasonic transmitter, and determine the position of the preset landing point tag according to the ultrasonic signal.
Optionally, the detection module is further configured to detect a mobile platform, and determine the mobile platform with the preset landing point tag on the surface as the target mobile platform.
Optionally, the preset landing point tag includes patterns of different shapes, and the detection module is further configured to obtain an image of the landing point tag of a mobile platform on the surface of which a landing point tag is disposed, and determine the mobile platform on which the landing point tag is disposed as the target mobile platform when it is determined that the image of the landing point tag matches the pattern of the preset landing point tag.
Optionally, the preset landing point tag includes an ultrasonic transmitter, and the detection module is further configured to receive an ultrasonic signal, track the ultrasonic transmitter according to the ultrasonic signal, and determine the mobile platform where the ultrasonic transmitter is located as the target mobile platform.
Optionally, the state parameter of the target mobile platform includes a motion parameter of the target mobile platform, and the detection module includes: the calculating unit is used for calculating the motion parameters of the landing point label relative to the unmanned aerial vehicle; and the determining unit is used for determining the motion parameters of the target mobile platform according to the motion parameters of the landing point tag relative to the unmanned aerial vehicle.
Optionally, the processing module is configured to calculate a relative state parameter between the unmanned aerial vehicle and the target mobile platform according to the state parameter of the unmanned aerial vehicle and the state parameter of the target mobile platform, and plan a flight path of the unmanned aerial vehicle from a position where the unmanned aerial vehicle is located to the preset landing point tag according to the relative state parameter.
Optionally, the control module is further configured to adjust the state parameters of the drone in real time during the process of controlling the drone to travel along the flight path; and when the unmanned aerial vehicle has travelled to within a preset height of the preset landing point tag and the current state parameters of the unmanned aerial vehicle match the current state parameters of the target mobile platform, to control the unmanned aerial vehicle to land on the preset landing point tag of the target mobile platform.
An electronic device comprising a memory and a processor, the memory having stored therein computer readable instructions which, when executed by the processor, cause the processor to perform the above-mentioned landing control method or to implement the functions of the above-mentioned landing control system.
A non-transitory readable storage medium storing computer readable instructions which, when executed by a processor, cause the processor to perform the landing control method or to implement the functions of the landing control system.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the application will be apparent from the description and drawings, and from the claims.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is an application scenario diagram of a landing control method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a preset landing point tag of a target mobile platform according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an unmanned aerial vehicle according to an embodiment of the present application;
FIG. 4 is a flow chart of a landing control method according to an embodiment of the present application;
FIG. 5 is a block diagram of a landing control system according to an embodiment of the present application;
Reference numerals: unmanned aerial vehicle, 10; mobile platform, 20; preset landing point tag, 21; system bus, 11; processor, 12; memory, 13; GPS positioning system, 14; sensor system, 15; monocular/binocular vision measurement system, 16; image acquisition system, 17; signal receiving unit, 18; landing control system, 30; detection module, 31; calculation unit, 311; determination unit, 312; processing module, 32; and control module, 33.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Referring to fig. 1, an embodiment of the present application provides a landing control method applied to an unmanned aerial vehicle 10 for controlling the unmanned aerial vehicle 10 to land on a target mobile platform 20. In this embodiment, the target mobile platform 20 may be a running vehicle, a sailing ship, or the like. In one embodiment, the target mobile platform 20 may include a Global Positioning System (GPS), a sensor system, and a signal transmitting unit connected to the GPS and the sensor system. The GPS positioning system is used to generate the positioning information of the target mobile platform 20 itself in real time. The sensor system includes inertial measurement units, angle sensors, velocity sensors, accelerometers, etc. for generating parameters of motion of the target mobile platform 20 including velocity, angle, angular velocity, acceleration, etc. The signal transmitting unit is used for transmitting signals containing the positioning information and the state parameters outwards. The target moving platform 20 is provided with a preset landing point tag 21 on the surface. In the present embodiment, the surface of the target moving platform 20 refers to an open approximately horizontal surface of the target moving platform 20, such as an upper surface of a roof of a vehicle, an upper surface of a deck of a ship, and the like.
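The application does not specify the format of the transmitted signal; purely as a hedged illustration, the following Python sketch shows one way the target mobile platform's GPS positioning system, sensor system and signal transmitting unit could be tied together. The PlatformState fields, the UDP port 14550 and the send_state helper are assumptions introduced for this example, not part of the disclosure.

# Hedged sketch of the platform-side broadcast; message layout and port are assumptions.
import json, socket, time
from dataclasses import dataclass, asdict

@dataclass
class PlatformState:
    lat: float          # GPS latitude of the mobile platform (deg)
    lon: float          # GPS longitude (deg)
    alt: float          # altitude (m)
    heading_deg: float  # heading angle from the angle sensor
    speed_mps: float    # speed from the velocity sensor
    accel_mps2: float   # acceleration from the accelerometer
    yaw_rate_dps: float # angular velocity from the inertial measurement unit
    stamp: float        # measurement timestamp (s)

def send_state(state: PlatformState, port: int = 14550) -> None:
    """Broadcast one state message; stands in for the signal transmitting unit."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(json.dumps(asdict(state)).encode(), ("255.255.255.255", port))
    sock.close()

if __name__ == "__main__":
    send_state(PlatformState(31.30, 120.62, 5.0, 90.0, 12.0, 0.2, 0.0, time.time()))

Including a timestamp with each message lets the receiving unmanned aerial vehicle compensate for transmission latency when it fuses the broadcast with its own measurements.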
The preset landing point tag 21 includes patterns of different shapes. The patterns of different shapes include a cross pattern, a circular pattern, a square pattern, an oval pattern, a triangular pattern, and the like. Referring to fig. 2, in the present embodiment, the patterns of different shapes are a cross pattern, a circular pattern and a square pattern, the circular pattern being located within the square pattern and the cross pattern being located within the circular pattern. In one embodiment, the cross pattern, the circular pattern and the square pattern are arranged concentrically. It is understood that these patterns are only examples and are not limiting. In other embodiments, the patterns of different shapes may also be a triangular pattern, a circular pattern and a square pattern, the circular pattern being located within the square pattern and the triangular pattern being located within the circular pattern. In this embodiment, the preset landing point tag may further include an ultrasonic transmitter for transmitting ultrasonic signals, so that the unmanned aerial vehicle can locate the landing point tag.
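Purely as an illustrative sketch (not the application's detector), the concentric square-and-circle structure of the preset landing point tag 21 could be located in a camera frame with OpenCV roughly as follows; the thresholds, the minimum area and the file name are assumptions.

# Illustrative tag detector; all numeric thresholds are assumptions.
import cv2

def locate_landing_tag(image_bgr):
    """Return the pixel centre of a square-circle tag, or None if not found.

    Heuristic: find a 4-vertex outer contour (the square) and accept it only if
    a circle is detected near its centre.  A real detector would also verify the
    inner cross and estimate pose from the tag's known physical size.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                               param1=100, param2=40, minRadius=10, maxRadius=0)
    for cnt in contours:
        approx = cv2.approxPolyDP(cnt, 0.03 * cv2.arcLength(cnt, True), True)
        if len(approx) != 4 or cv2.contourArea(approx) < 500:
            continue                               # not a plausible square
        cx, cy = approx.reshape(-1, 2).astype(float).mean(axis=0)
        if circles is not None:
            for x, y, r in circles[0]:
                if abs(x - cx) < r and abs(y - cy) < r:
                    return (cx, cy)                # concentric square and circle found
    return None

frame = cv2.imread("surface_image.png")            # hypothetical file name
if frame is not None:
    print(locate_landing_tag(frame))

Requiring two or more of the nested shapes before accepting a candidate makes such a detector more robust against square-ish objects (windows, roof hatches) that are not the tag.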
The landing control method may be performed by the drone 10. Referring to fig. 3, the drone 10 includes a processor 12 and a memory 13 connected by a system bus 11. The memory 13 has stored therein computer readable instructions. The processor 12 is configured to provide computing and control capabilities and may retrieve and execute the computer readable instructions to perform the landing control method described below. In this embodiment, the unmanned aerial vehicle 10 further includes a GPS positioning system 14, a sensor system 15, a monocular/binocular vision measuring system 16, an image acquisition system 17, and a signal receiving unit 18, which are connected to the processor 12 and the memory 13 through the system bus 11.
The GPS positioning system 14 is used to generate GPS data for the drone itself. The sensor system 15 includes an inertial measurement unit, an angle sensor, a velocity sensor, an accelerometer, etc., and is used to generate motion parameters of the drone 10 including velocity, angle, angular velocity, acceleration, etc. The monocular/binocular vision measuring system 16 is used to detect the target moving platform 20 and obstacles. The image capturing system 17 is used to capture the surface image of the target moving platform 20 and the image of the landing point tag of a mobile platform with a landing point tag on its surface. The signal receiving unit 18 is configured to receive the signal containing the state parameters of the target mobile platform 20 transmitted by the signal transmitting unit of the target mobile platform 20.
Referring to fig. 4, in the present embodiment, the landing control method includes the following steps.
Step S101: acquiring the state parameters of the unmanned aerial vehicle.
The state parameters of the unmanned aerial vehicle include the GPS data, speed, angle, angular velocity, etc. of the unmanned aerial vehicle. The unmanned aerial vehicle can acquire its own GPS data through the GPS positioning system, and its speed, angle, angular velocity, etc. through the sensor system. In this embodiment, the unmanned aerial vehicle acquires its own state parameters in real time. The processor can predict the state parameters of the unmanned aerial vehicle at a future time based on a first preset algorithm and the state parameters of the unmanned aerial vehicle at the current time. The first preset algorithm includes at least one of EKF (Extended Kalman Filter), EKF2, EKF3, CF (Complementary Filter), VIO (Visual-Inertial Odometry), SVO (Semi-direct Visual Odometry), LK (Lucas-Kanade optical flow algorithm), and SLAM (Simultaneous Localization and Mapping).
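The application lists these estimators without detailing them; as a simplified, hedged stand-in for the prediction step (a constant-acceleration, constant-yaw-rate model rather than a full EKF or VIO pipeline), the idea can be sketched as:

# Simplified prediction model; an actual EKF would also propagate the covariance.
import numpy as np

def predict_state(p, v, a, yaw, yaw_rate, dt):
    """Predict the drone state dt seconds ahead.

    p, v, a are 3-vectors (m, m/s, m/s^2); yaw and yaw_rate are in radians.
    """
    p_next = p + v * dt + 0.5 * a * dt ** 2
    v_next = v + a * dt
    yaw_next = yaw + yaw_rate * dt
    return p_next, v_next, yaw_next

p, v, a = np.zeros(3), np.array([2.0, 0.0, -0.5]), np.zeros(3)
print(predict_state(p, v, a, yaw=0.0, yaw_rate=0.1, dt=0.5))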
Step S102: acquiring the state parameters of a target mobile platform, wherein a preset landing point label is arranged on the surface of the target mobile platform.
The state parameters of the target mobile platform include GPS data of the target mobile platform and motion parameters including a heading angle, a pitch angle, a roll angle, a travelling speed, an acceleration, an angular velocity, an angular acceleration and the like. In one embodiment, the unmanned aerial vehicle can receive, through the signal receiving unit, a signal containing GPS data transmitted by the signal transmitting unit of the target mobile platform to acquire the GPS data of the target mobile platform, detect the target mobile platform through the monocular/binocular vision measuring system, and acquire the motion parameters of the target mobile platform based on a computer vision algorithm. In this embodiment, the unmanned aerial vehicle can receive, through the signal receiving unit, the real-time signal containing GPS data transmitted by the signal transmitting unit of the target mobile platform to acquire the real-time GPS data of the target mobile platform, so as to position the target mobile platform in real time, and can acquire the motion parameters of the target mobile platform in real time based on the computer vision algorithm.
In an embodiment, the unmanned aerial vehicle may receive, by the signal receiving unit, a real-time signal containing GPS data transmitted by the signal transmitting unit of the target mobile platform to obtain the GPS data of the target mobile platform, and obtain the real-time motion parameter of the target mobile platform based on a computer vision algorithm. The unmanned aerial vehicle can predict the state parameters of the target mobile platform at the future moment based on the first preset algorithm according to the GPS data of the target mobile platform and the real-time motion parameters of the target mobile platform, and meanwhile, the target mobile platform is positioned in real time.
In another embodiment, the target mobile platform may generate its GPS data through the GPS positioning system thereon, generate its motion parameters including velocity, angle, angular velocity, acceleration and the like through the sensor system, and transmit a signal containing the GPS data and the motion parameters of the mobile platform through the signal transmitting unit. The unmanned aerial vehicle receives the state parameters of the target mobile platform transmitted by the signal transmitting unit through the signal receiving unit. In this way, the state parameters of the target mobile platform are acquired by the target mobile platform itself, which reduces the computational load on the unmanned aerial vehicle compared with having the unmanned aerial vehicle derive these parameters through algorithmic processing. Optionally, the target mobile platform may generate its real-time GPS data through the GPS positioning system thereon and generate its real-time motion parameters through the sensor system, and the unmanned aerial vehicle receives the real-time state parameters of the target mobile platform transmitted by the signal transmitting unit in real time through the signal receiving unit, so as to realize real-time positioning of the target mobile platform.
In this embodiment, the unmanned aerial vehicle obtaining the state parameters of the target mobile platform includes: calculating motion parameters (including relative speed, angle, angular velocity, acceleration and the like) of the preset landing point label relative to the unmanned aerial vehicle based on a computer vision algorithm; and determining the motion parameters of the target mobile platform according to the motion parameters of the preset landing point label relative to the unmanned aerial vehicle.
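A minimal sketch of this step, assuming the vision pipeline already provides the label's position and velocity relative to the unmanned aerial vehicle expressed in the world frame (in practice the camera-frame measurement would first be rotated into the world frame using the drone's attitude):

# Minimal sketch: platform motion recovered from the label's relative motion.
import numpy as np

def platform_motion_from_label(drone_pos, drone_vel, label_rel_pos, label_rel_vel):
    """Recover the target platform's world-frame position and velocity from the
    label's position/velocity measured relative to the drone (for example by
    finite-differencing successive vision fixes).  All arguments are 3-vectors
    expressed in the same world frame."""
    platform_pos = drone_pos + label_rel_pos
    platform_vel = drone_vel + label_rel_vel
    return platform_pos, platform_vel

# Example: a drone flying at 3 m/s sees the label 8 m ahead, receding at 1 m/s.
pos, vel = platform_motion_from_label(np.array([0., 0., 10.]), np.array([3., 0., 0.]),
                                       np.array([8., 0., -10.]), np.array([1., 0., 0.]))
print(pos, vel)   # platform at (8, 0, 0) moving at 4 m/s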
Step S103: planning a flight path of the unmanned aerial vehicle from the position where the unmanned aerial vehicle is located to the preset landing point label according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform.
In this embodiment, after obtaining the state parameters of the target mobile platform, the unmanned aerial vehicle may plan the flight path of the unmanned aerial vehicle from the position where it is located to the preset landing point label based on a second preset algorithm, according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform. The second preset algorithm includes a minimum snap trajectory algorithm and/or a minimum jerk trajectory algorithm.
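For concreteness, a single-segment minimum-jerk solution (one simple instance of the trajectory-algorithm family named above; the boundary values below are illustrative) can be written in closed form:

# Single-axis, single-segment minimum-jerk trajectory (quintic polynomial).
import numpy as np

def min_jerk_coeffs(p0, v0, a0, pT, vT, aT, T):
    """Quintic-polynomial coefficients taking one axis from state (p0, v0, a0)
    at t=0 to (pT, vT, aT) at t=T.  A minimum-snap planner follows the same
    pattern with 7th-order polynomials and several segments."""
    A = np.array([
        [1, 0,    0,       0,        0,         0],
        [0, 1,    0,       0,        0,         0],
        [0, 0,    2,       0,        0,         0],
        [1, T,    T**2,    T**3,     T**4,      T**5],
        [0, 1,    2*T,     3*T**2,   4*T**3,    5*T**4],
        [0, 0,    2,       6*T,      12*T**2,   20*T**3],
    ])
    b = np.array([p0, v0, a0, pT, vT, aT])
    return np.linalg.solve(A, b)

# Descend 10 m in 5 s, arriving with zero vertical speed and acceleration.
c = min_jerk_coeffs(10.0, 0.0, 0.0, 0.0, 0.0, 0.0, 5.0)
t = 2.5
print(sum(ci * t**i for i, ci in enumerate(c)))   # altitude at mid-trajectory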
In this embodiment, planning the flight path of the unmanned aerial vehicle from the position where it is located to the preset landing point label according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform includes: calculating relative state parameters of the unmanned aerial vehicle and the target mobile platform according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform; and planning the flight path along which the unmanned aerial vehicle lands onto the preset landing point label according to the relative state parameters.
In order to improve the accuracy of the flight path of the unmanned aerial vehicle from the position where it is located to the preset landing point label, the unmanned aerial vehicle can predict the running state of the target mobile platform at a future time based on the second preset algorithm according to the state parameters of the target mobile platform, and then carry out the flight path planning of the unmanned aerial vehicle from the position where it is located to the preset landing point label on the target mobile platform based on the second preset algorithm, according to the obtained state parameters of the unmanned aerial vehicle, the prediction of the state parameters of the unmanned aerial vehicle at the future time, the state parameters of the target mobile platform, and the prediction of the state parameters of the target mobile platform at the future time.
It can be understood that, in order to further improve the accuracy of path planning, obstacles between the unmanned aerial vehicle and the target mobile platform can also be taken into account when the flight path of the unmanned aerial vehicle from the position where it is located to the preset landing point label is planned based on the second preset algorithm. That is, the unmanned aerial vehicle can also detect obstacles through the monocular/binocular vision measurement system; the unmanned aerial vehicle then predicts the state parameters of the target mobile platform based on the second preset algorithm according to the obstacle condition and the state parameters of the target mobile platform, and plans the flight path of the unmanned aerial vehicle from the position where it is located to the preset landing point label of the target mobile platform based on the second preset algorithm according to the obstacle condition, the state parameters of the target mobile platform, the prediction of the state parameters of the target mobile platform at the future time, the state parameters of the unmanned aerial vehicle, and the prediction of the state parameters of the unmanned aerial vehicle at the future time. It can be understood that, because both the mobile platform and the unmanned aerial vehicle are moving, the unmanned aerial vehicle can plan a plurality of candidate flight paths based on the second preset algorithm according to the obstacle condition, the state parameters of the target mobile platform, the prediction of the state parameters of the mobile platform at the future time, the state parameters of the unmanned aerial vehicle and the prediction of the state parameters of the unmanned aerial vehicle at the future time.
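The application does not state how candidate paths are compared; the sketch below is only one plausible scoring scheme (path length plus an obstacle-clearance penalty, with illustrative weights), while the actual determination of the target path by the fourth preset algorithm is described in step S104 below.

# Hedged sketch: score candidate waypoint paths against detected obstacles.
import numpy as np

def path_cost(path_xyz, obstacles_xyz, clearance_m=2.0, w_len=1.0, w_obs=10.0):
    """Shorter paths are cheaper; any waypoint closer than the clearance radius
    to an obstacle adds a penalty.  Weights and clearance are illustrative."""
    pts = np.asarray(path_xyz, dtype=float)
    length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    penalty = 0.0
    for obs in np.asarray(obstacles_xyz, dtype=float):
        d = np.linalg.norm(pts - obs, axis=1)
        penalty += np.sum(np.maximum(0.0, clearance_m - d))
    return w_len * length + w_obs * penalty

def pick_target_path(candidates, obstacles):
    return min(candidates, key=lambda p: path_cost(p, obstacles))

paths = [[(0, 0, 10), (5, 0, 5), (10, 0, 0)],
         [(0, 0, 10), (5, 3, 5), (10, 0, 0)]]
print(pick_target_path(paths, obstacles=[(5, 0, 5)]))   # avoids the blocked path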
Step S104: controlling the unmanned aerial vehicle to travel along the flight path so as to land on the preset landing point label of the target mobile platform.
After the candidate flight paths for landing the unmanned aerial vehicle onto the preset landing point label of the mobile platform are planned, the unmanned aerial vehicle can determine a target flight path from the plurality of candidate flight paths based on a fourth preset algorithm and control itself to travel along the target flight path so as to land on the preset landing point label of the target mobile platform. The fourth preset algorithm includes at least one of a PID (Proportional Integral Derivative) algorithm, an ADRC (Active Disturbance Rejection Control) algorithm, an MPC (Model Predictive Control) algorithm, and an LQR (Linear Quadratic Regulator) algorithm. The unmanned aerial vehicle adjusts its motion parameters in real time based on the fourth preset algorithm in the process of controlling itself to travel along the target flight path; and when the unmanned aerial vehicle has travelled to within a preset height of the preset landing point label (the preset height may be, for example, less than 10 cm) and its current state parameters match the current state parameters of the target mobile platform, the unmanned aerial vehicle is controlled to land on the preset landing point label of the target mobile platform. In the process of controlling the unmanned aerial vehicle to travel along the flight path and land on the preset landing point label of the target mobile platform, the unmanned aerial vehicle can acquire, in real time through the sensor system, information such as its inclination in each direction, its height above the ground and its travelling speed, determine the direction and distance of the target mobile platform relative to the unmanned aerial vehicle in real time based on a computer vision algorithm, and then adjust its motion parameters in real time based on that direction and distance, so that it can land on the preset landing point label of the target mobile platform relatively accurately.
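As a hedged sketch of the real-time adjustment and the landing condition (the gains, the 10 cm preset height and the velocity tolerance are illustrative assumptions; the fourth preset algorithm named above may equally be ADRC, MPC or LQR):

# Hedged sketch: one PID channel for tracking, plus the landing-ready check.
import numpy as np

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def step(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def landing_ready(drone_alt, label_alt, drone_vel, platform_vel,
                  preset_height=0.10, vel_tol=0.2):
    """True when the drone is within the preset height of the label (e.g. <10 cm)
    and its velocity matches the platform's within a tolerance."""
    return (drone_alt - label_alt) < preset_height and \
           np.linalg.norm(np.asarray(drone_vel) - np.asarray(platform_vel)) < vel_tol

# One horizontal-tracking step: command a velocity correction toward the label.
pid_x = PID(kp=0.8, ki=0.05, kd=0.2, dt=0.05)
print(pid_x.step(err=1.5))                          # label is 1.5 m ahead of the drone
print(landing_ready(0.08, 0.0, [4.0, 0, 0], [4.1, 0, 0]))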
In this embodiment, before step S101, the landing control method further includes the following steps.
Step S211: detecting the target mobile platform.
The unmanned aerial vehicle can detect the target mobile platform in real time by adopting a remote positioning method, for example, the unmanned aerial vehicle can detect the target mobile platform in real time by a GPS (global positioning system) arranged on the unmanned aerial vehicle and the GPS arranged on the target mobile platform, and the approximate position and distance of the target mobile platform relative to the unmanned aerial vehicle are determined after the target mobile platform is detected; or mobile network terminals can be respectively installed on the unmanned aerial vehicle and the target mobile platform, so that the unmanned aerial vehicle can detect the target mobile platform in real time based on a mobile network, and the approximate position and distance of the target mobile platform relative to the unmanned aerial vehicle can be determined after the target mobile platform is detected; or the unmanned aerial vehicle can detect the target mobile platform in real time through the detection radar, and determine the approximate direction and distance of the target mobile platform relative to the unmanned aerial vehicle after the target mobile platform is detected.
When the unmanned aerial vehicle determines that the distance between the target mobile platform and the unmanned aerial vehicle is larger than a preset value, the unmanned aerial vehicle navigates towards the target mobile platform, and when the distance between the target mobile platform and the unmanned aerial vehicle is determined to be smaller than the preset value, the unmanned aerial vehicle is switched to an accurate positioning method to position the target mobile platform. For example, positioning of the target mobile platform is realized by acquiring images and performing image recognition; or, an ultrasonic transmitter is arranged on the unmanned aerial vehicle, and the target moving platform is detected by utilizing ultrasonic waves; or, the positioning of the target moving platform is realized by arranging the laser detector on the unmanned aerial vehicle.
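A small sketch of this switch between remote positioning and the accurate positioning method, assuming only that both GPS fixes are available on the unmanned aerial vehicle; the 50 m threshold is an assumption, since the application leaves the preset value unspecified.

# Hedged sketch: haversine distance drives the positioning-mode switch.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def positioning_mode(drone_fix, platform_fix, threshold_m=50.0):
    """Far from the platform, keep navigating on GPS; once inside the preset
    distance, switch to the precise method (vision, ultrasound or laser)."""
    d = haversine_m(*drone_fix, *platform_fix)
    return ("precise" if d < threshold_m else "gps_navigation", d)

print(positioning_mode((31.3000, 120.6200), (31.3004, 120.6200)))  # ~44 m apart -> precise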
In this embodiment, because the target mobile platform is provided with a GPS positioning system and the signal transmitting unit of the target mobile platform transmits the signal containing the GPS data to the outside, the unmanned aerial vehicle can, when it is determined that the distance between itself and the target mobile platform is smaller than the preset value, detect the target mobile platform by receiving this signal.
Because the preset landing point label includes an ultrasonic transmitter, when it is determined that the distance between the unmanned aerial vehicle and the target mobile platform is smaller than the preset value, the unmanned aerial vehicle can also detect the target mobile platform by receiving the ultrasonic signal.
When a plurality of mobile platforms exist and more than one of them has a landing point label on its surface, the unmanned aerial vehicle can, after determining that the distance between the target mobile platform and itself is smaller than the preset value, acquire images of the landing point labels of the mobile platforms with landing point labels on their surfaces, and determine the mobile platform with the preset landing point label on its surface as the target mobile platform. Specifically, when determining that the image of a landing point label matches the pattern of the preset landing point label, the unmanned aerial vehicle determines the mobile platform provided with that landing point label as the target mobile platform.
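One hedged way to implement this label-matching decision is normalised template matching; the file names and the 0.8 threshold below are hypothetical, and a practical matcher would also handle scale and rotation (for example with feature- or moment-based matching).

# Illustrative matcher only; file names and threshold are assumptions.
import cv2

def is_target_platform(label_image_gray, preset_label_gray, threshold=0.8):
    """Decide whether the landing point label seen on a candidate platform
    matches the preset pattern, using normalised cross-correlation."""
    result = cv2.matchTemplate(label_image_gray, preset_label_gray, cv2.TM_CCOEFF_NORMED)
    max_score = cv2.minMaxLoc(result)[1]
    return max_score >= threshold

candidate = cv2.imread("candidate_label.png", cv2.IMREAD_GRAYSCALE)   # hypothetical files
reference = cv2.imread("preset_label.png", cv2.IMREAD_GRAYSCALE)
if candidate is not None and reference is not None:
    print(is_target_platform(candidate, reference))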
When there are a plurality of mobile platforms and the preset landing point label of the target mobile platform includes an ultrasonic transmitter, the unmanned aerial vehicle can receive the ultrasonic signal, track the ultrasonic transmitter according to the ultrasonic signal, and determine the mobile platform to which the ultrasonic transmitter belongs as the target mobile platform.
Step S212: after the target mobile platform is detected and the distance between the target mobile platform and the unmanned aerial vehicle is determined to be smaller than a preset value, determining the position of the preset landing point label.
Because the preset landing point label is arranged on the surface of the target mobile platform, after the target mobile platform is detected and the distance between the target mobile platform and the unmanned aerial vehicle is determined to be smaller than the preset value, the unmanned aerial vehicle can acquire the surface image of the target mobile platform and identify the preset landing point label from the surface image based on an image recognition algorithm so as to determine the position of the preset landing point label. In the process of acquiring the surface image of the target mobile platform, the unmanned aerial vehicle can acquire, in real time through the sensor system, information such as its inclination in each direction, its height above the ground and its travelling speed, and determine the direction and distance of the target mobile platform relative to the unmanned aerial vehicle in real time based on a computer vision algorithm, so as to provide a reference for the flight path planning in step S103.
When the preset landing point label includes an ultrasonic transmitter, the unmanned aerial vehicle can, after detecting the target mobile platform and determining that the distance between the target mobile platform and itself is smaller than the preset value, receive the ultrasonic signal transmitted by the ultrasonic transmitter and determine the position of the preset landing point label according to the ultrasonic signal.
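For illustration only, ranging on the received ultrasonic signal reduces to a time-of-flight calculation; note that measuring one-way flight time from a transmitting beacon presupposes some form of synchronisation (for example an RF sync pulse), which the application does not specify.

# Illustrative one-way time-of-flight range estimate.
def ultrasonic_range_m(time_of_flight_s, air_temp_c=20.0, one_way=True):
    """Distance to the label's ultrasonic transmitter from the measured travel
    time.  Speed of sound is approximately 331.3 + 0.606*T m/s; with a beacon
    that transmits (rather than echoes), the path is one-way."""
    c = 331.3 + 0.606 * air_temp_c
    d = c * time_of_flight_s
    return d if one_way else d / 2.0

print(round(ultrasonic_range_m(0.01), 2))   # ~3.43 m for a 10 ms one-way flight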
It can be understood that a laser detector may further be arranged on the unmanned aerial vehicle; in that case, after the unmanned aerial vehicle detects the target mobile platform and determines that the distance between the target mobile platform and itself is smaller than the preset value, the position of the preset landing point label can be determined through the laser detector.
According to the landing control method, the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform are obtained, the flight path for landing on the preset landing point label of the target mobile platform is planned according to these parameters, and the unmanned aerial vehicle is then controlled to fly and land, so that the unmanned aerial vehicle can land on the target mobile platform, thereby widening the application range of the unmanned aerial vehicle.
The following briefly lists several application scenarios of the landing control method provided in the present application.
Scene 1: when performing special tasks such as rescue or fire fighting, the unmanned aerial vehicle can carry out emergency rescue or fire detection at positions that fire-fighting vehicles cannot reach, and after completing the task it can use the landing control method provided by the present application to land on a moving rescue vehicle or fire engine whose surface is provided with the preset landing point label. Alternatively, the unmanned aerial vehicle can cooperate with a vehicle to carry out synchronized patrol, inspection and the like, and land on the moving vehicle whose surface is provided with the preset landing point label by using the landing control method provided by the present application after the patrol or inspection task is finished.
Scene 2: by using the landing control method provided by the present application, the unmanned aerial vehicle can, when its remaining battery level is low, land on a moving vehicle whose surface is provided with the preset landing point label in order to charge. The charging modes include wired charging and wireless charging. When wireless charging is used, the landing control method provided by the present application can keep the unmanned aerial vehicle and the vehicle travelling synchronously, and wireless charging is realized by arranging wireless charging modules at the preset landing point label position on the surface of the vehicle and on the unmanned aerial vehicle respectively.
Scene 3: when the target mobile platform is a vehicle and the vehicle travels into an area with a weak GPS signal or no GPS signal, the unmanned aerial vehicle, which flies in the air where the GPS signal is relatively good and the field of view is relatively wide, can acquire terrain images and road condition information and send them to the navigation system of the vehicle to determine a suitable navigation route for the vehicle; after the navigation route planning is completed, the unmanned aerial vehicle can land at the preset landing point label position on the surface of the vehicle by using the landing control method provided by the present application.
Scene 4: during a road trip, the unmanned aerial vehicle can be used for aerial photography; after the unmanned aerial vehicle finishes shooting, it can land at the preset landing point label position arranged on the surface of the vehicle body by using the landing control method provided by the present application.
Scene 5: when carrying out freight transport or goods delivery, the unmanned aerial vehicle can be used to transport or deliver goods to places that a freight vehicle cannot reach; after completing the transport or delivery, it lands at the position on the surface of the freight vehicle where the preset landing point label is arranged, by using the landing control method provided by the present application.
Scene 6: when shooting video of a moving scene (for example, a scene in which an actor is driving), keeping the photographer's vehicle stationary relative to the actor's vehicle is troublesome; instead, the unmanned aerial vehicle can shoot the actor's vehicle and then land at the preset landing point label position on the surface of the photographer's vehicle by using the landing control method provided by the present application.
Scene 7: in maritime security and monitoring, the unmanned aerial vehicle can land on a cruising ship by using the landing control method provided by the present application after finishing security inspections and the monitoring of illegal activities.
Referring to fig. 5, based on the same inventive concept, an embodiment of the present application further provides a landing control system 30, applied to an unmanned aerial vehicle, including: the detection module 31 is configured to acquire state parameters of the unmanned aerial vehicle and state parameters of a target mobile platform, where a preset landing point tag is arranged on the surface of the target mobile platform; the processing module 32 is configured to plan a flight path of the unmanned aerial vehicle from a position where the unmanned aerial vehicle is located to land to the preset landing point tag according to the state parameter of the unmanned aerial vehicle and the state parameter of the target mobile platform; and a control module 33, configured to control the unmanned aerial vehicle to travel along the flight path to land on the preset landing point tag of the target mobile platform.
The detection module 31 is further configured to detect the target moving platform, and determine the position of the preset landing point tag after detecting the target moving platform.
The preset landing point tag includes patterns of different shapes, and the detection module 31 is configured to acquire a surface image of the target mobile platform, and identify the preset landing point tag from the surface image, so as to determine a position of the preset landing point tag.
The preset landing point tag comprises an ultrasonic transmitter, and the detection module 31 is configured to receive an ultrasonic signal transmitted by the ultrasonic transmitter, and determine the position of the preset landing point tag according to the ultrasonic signal.
The detection module 31 is further configured to detect a mobile platform, and determine the mobile platform with the preset landing point tag on the surface as the target mobile platform.
The preset landing point tag includes patterns of different shapes, and the detection module 31 is further configured to acquire an image of the landing point tag of a mobile platform on the surface of which a landing point tag is arranged, and determine the mobile platform provided with the landing point tag as the target mobile platform when determining that the image of the landing point tag matches the pattern of the preset landing point tag.
The preset landing point tag comprises an ultrasonic transmitter, the detection module 31 is further configured to receive an ultrasonic signal, track the ultrasonic transmitter according to the ultrasonic signal, and determine the mobile platform where the ultrasonic transmitter is located as the target mobile platform.
The state parameters of the target mobile platform include motion parameters of the target mobile platform, and the detection module 31 includes: a calculating unit 311, configured to calculate a motion parameter of the landing point tag relative to the drone; a determining unit 312, configured to determine a motion parameter of the target mobile platform according to a motion parameter of the landing point tag relative to the drone.
The processing module 32 is configured to calculate a relative state parameter between the unmanned aerial vehicle and the target mobile platform according to the state parameter of the unmanned aerial vehicle and the state parameter of the target mobile platform, and plan a flight path of the unmanned aerial vehicle from a position where the unmanned aerial vehicle is located to the preset landing point tag according to the relative state parameter.
The control module 33 is further configured to adjust the state parameters of the drone in real time during the process of controlling the drone to travel along the flight path, and, when the unmanned aerial vehicle has travelled to within a preset height of the preset landing point tag and the current state parameters of the unmanned aerial vehicle match the current state parameters of the target mobile platform, to control the unmanned aerial vehicle to land on the preset landing point tag of the target mobile platform.
It can be understood that the landing control system provided by the present application corresponds to the landing control method provided by the present application, and for brevity of the description, the same or similar parts may refer to the contents of the landing control method part, and are not described herein again.
The various modules in the above-described landing control system may be implemented in whole or in part by software, hardware, and combinations thereof. The modules can be embedded in a hardware form in, or be independent of, a processor in the server, and can also be stored in a memory in the server in a software form, so that the processor can call and execute the operations corresponding to the modules. The processor can be a Central Processing Unit (CPU), a microprocessor, a single-chip microcomputer, and the like.
The above described landing control method and/or landing control system may be implemented in the form of computer readable instructions which may be run on a drone as shown in figure 3.
Based on the same inventive concept, embodiments of the present application provide a computer-readable storage medium on which computer-readable instructions are stored, and the computer-readable instructions, when executed by a processor, implement the steps of the landing control method described above.
Any reference to memory, storage, database, or other medium as used herein may include non-volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (22)

1. A landing control method is applied to an unmanned aerial vehicle and comprises the following steps:
acquiring state parameters of the unmanned aerial vehicle;
acquiring state parameters of a target mobile platform, wherein a preset landing point label is arranged on the surface of the target mobile platform;
planning a flight path of the unmanned aerial vehicle from the position where the unmanned aerial vehicle is located to the preset landing point label according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform; and
controlling the unmanned aerial vehicle to travel along the flight path so as to land on the preset landing point label of the target mobile platform.
2. A landing control method according to claim 1, wherein prior to the obtaining of the state parameters of the target mobile platform, the method further comprises:
detecting the target mobile platform;
and after the target mobile platform is detected and the distance between the target mobile platform and the unmanned aerial vehicle is determined to be smaller than a preset value, determining the position of the preset landing point label.
3. A landing control method according to claim 2, wherein the preset landing point label comprises patterns of different shapes, and wherein the determining the position of the preset landing point label comprises:
acquiring a surface image of the target mobile platform;
and identifying the preset landing point label from the surface image so as to determine the position of the preset landing point label.
4. A landing control method according to claim 2, wherein the preset landing point label comprises an ultrasonic transmitter, and wherein the determining the position of the preset landing point label comprises:
receiving an ultrasonic signal transmitted by the ultrasonic transmitter;
and determining the position of the preset landing point label according to the ultrasonic signal.
5. A landing control method according to claim 1, wherein prior to the obtaining of the state parameters of the target mobile platform, the method further comprises:
detecting a mobile platform;
and determining the mobile platform with the preset landing point label on the surface as the target mobile platform.
6. A landing control method according to claim 5, wherein the preset landing point label includes patterns of different shapes, and the determining the mobile platform having the preset landing point label provided on the surface thereof as the target mobile platform comprises:
acquiring an image of the landing point label of a mobile platform with a landing point label on the surface;
and when the image of the landing point label is determined to match the pattern of the preset landing point label, determining the mobile platform provided with the landing point label as the target mobile platform.
7. A landing control method according to claim 5, wherein the preset landing point label includes an ultrasonic transmitter, and the determining of the mobile platform having the preset landing point label provided on the surface thereof as the target mobile platform includes:
receiving an ultrasonic signal;
and tracking the ultrasonic transmitter according to the ultrasonic signal, and determining the mobile platform where the ultrasonic transmitter is located as the target mobile platform.
8. A landing control method according to claim 1, wherein the state parameters of the target mobile platform include motion parameters of the target mobile platform, and the obtaining the state parameters of the target mobile platform includes:
calculating the motion parameters of the preset landing point label relative to the unmanned aerial vehicle;
and determining the motion parameters of the target mobile platform according to the motion parameters of the preset landing point label relative to the unmanned aerial vehicle.
9. A landing control method according to claim 1, wherein the planning of the flight path of the drone from its own location to the preset landing point tag according to the state parameters of the drone and the state parameters of the target mobile platform comprises:
calculating the relative state parameters of the unmanned aerial vehicle and the target mobile platform according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform;
and planning a flight path of the unmanned aerial vehicle from the position where the unmanned aerial vehicle is located to the preset landing point label according to the relative state parameters.
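The claims do not fix a particular planner. The sketch below illustrates one very simple possibility: predict where the label will be under a constant-velocity assumption, aim for a point a preset height above that prediction, and discretise the straight line into waypoints. The approach height, the waypoint count and the constant-velocity prediction are assumptions.

```python
# Illustrative sketch only: one very simple way to turn the relative state
# parameters of claim 9 into a flight path. The constant-velocity prediction,
# the approach height and the waypoint spacing are assumptions.
import numpy as np

def plan_flight_path(drone_pos, drone_speed, label_pos, platform_vel,
                     approach_height=1.5, n_waypoints=20):
    """Return world-frame waypoints ending above the predicted label position."""
    drone_pos = np.asarray(drone_pos, dtype=float)
    label_pos = np.asarray(label_pos, dtype=float)
    platform_vel = np.asarray(platform_vel, dtype=float)

    # Rough time to reach the label, then predict where it will be by then,
    # assuming the platform keeps its current velocity.
    time_to_go = np.linalg.norm(label_pos - drone_pos) / max(drone_speed, 0.1)
    predicted_label = label_pos + platform_vel * time_to_go

    # Aim for a point a preset height above the predicted label position;
    # the final vertical descent is handled by the landing controller.
    approach_point = predicted_label + np.array([0.0, 0.0, approach_height])

    # Straight-line waypoints from the current position to the approach point.
    alphas = np.linspace(0.0, 1.0, n_waypoints)
    return [drone_pos + a * (approach_point - drone_pos) for a in alphas]
```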
10. A landing control method according to claim 1, wherein the controlling the unmanned aerial vehicle to travel along the flight path so as to land on the preset landing point label of the target mobile platform comprises:
adjusting the motion parameters of the unmanned aerial vehicle in real time in the process of controlling the unmanned aerial vehicle to travel along the flight path;
and when the unmanned aerial vehicle travels to within a preset height of the preset landing point label and the current state parameters of the unmanned aerial vehicle match the current state parameters of the target mobile platform, controlling the unmanned aerial vehicle to land on the preset landing point label of the target mobile platform.
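As a hedged illustration of the terminal phase in claim 10: a velocity-matching loop that feeds forward the platform's velocity, corrects the remaining horizontal offset, and commits to the descent only once the drone is within a preset height of the label and the state parameters agree within tolerances. The gains, tolerances and the send_velocity_setpoint() interface are hypothetical, not part of the claims.

```python
# Illustrative sketch only: a minimal terminal landing loop for claim 10.
# Gains, tolerances and the send_velocity_setpoint() callback are hypothetical.
import numpy as np

KP_POS = 0.8            # proportional gain on horizontal position error (assumed)
HEIGHT_THRESHOLD = 1.5  # preset height above the label (m, assumed)
POS_TOL = 0.15          # horizontal alignment tolerance (m, assumed)
VEL_TOL = 0.20          # velocity-matching tolerance (m/s, assumed)
DESCENT_SPEED = 0.4     # final descent rate (m/s, assumed)

def landing_step(drone_pos, drone_vel, label_pos, platform_vel,
                 send_velocity_setpoint):
    """One iteration of the terminal landing loop (z-up world frame).

    Returns True once the final descent has been commanded."""
    drone_pos, drone_vel = np.asarray(drone_pos), np.asarray(drone_vel)
    label_pos, platform_vel = np.asarray(label_pos, dtype=float), np.asarray(platform_vel, dtype=float)

    error_xy = label_pos[:2] - drone_pos[:2]
    height_above_label = drone_pos[2] - label_pos[2]

    # Feed-forward the platform's velocity and correct the horizontal offset.
    cmd = platform_vel.copy()
    cmd[:2] += KP_POS * error_xy
    cmd[2] = 0.0                                   # hold height by default

    aligned = np.linalg.norm(error_xy) < POS_TOL
    matched = np.linalg.norm(drone_vel[:2] - platform_vel[:2]) < VEL_TOL

    if height_above_label < HEIGHT_THRESHOLD and aligned and matched:
        cmd[2] = -DESCENT_SPEED                    # commit to touchdown
        send_velocity_setpoint(cmd)
        return True

    send_velocity_setpoint(cmd)
    return False
```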
11. A landing control system applied to an unmanned aerial vehicle, comprising:
a detection module, configured to acquire the state parameters of the unmanned aerial vehicle and the state parameters of a target mobile platform, wherein a preset landing point label is arranged on the surface of the target mobile platform;
a processing module, configured to plan a flight path of the unmanned aerial vehicle from the position where the unmanned aerial vehicle is located to the preset landing point label according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform; and
a control module, configured to control the unmanned aerial vehicle to travel along the flight path so as to land on the preset landing point label of the target mobile platform.
12. A landing control system according to claim 11, wherein the detection module is further configured to detect the target mobile platform and to determine the position of the preset landing point label after the target mobile platform is detected.
13. A landing control system according to claim 12, wherein the preset landing point label comprises patterns of different shapes, and the detection module is configured to acquire a surface image of the target mobile platform and to identify the preset landing point label from the surface image so as to determine the position of the preset landing point label.
14. A landing control system according to claim 12, wherein the preset landing point label comprises an ultrasonic transmitter, and the detection module is configured to receive an ultrasonic signal transmitted by the ultrasonic transmitter and to determine the position of the preset landing point label according to the ultrasonic signal.
15. A landing control system according to claim 11, wherein the detection module is further configured to detect a mobile platform and to determine a mobile platform with the preset landing point label on the surface as the target mobile platform.
16. A landing control system according to claim 15, wherein the preset landing point label comprises patterns of different shapes, and the detection module is further configured to acquire an image of the landing point label of a mobile platform with a landing point label on the surface, and to determine the mobile platform provided with the landing point label as the target mobile platform when the image of the landing point label matches the pattern of the preset landing point label.
17. A landing control system according to claim 15, wherein the preset landing point label comprises an ultrasonic transmitter, and the detection module is further configured to receive an ultrasonic signal, to track the ultrasonic transmitter according to the ultrasonic signal, and to determine the mobile platform where the ultrasonic transmitter is located as the target mobile platform.
18. A landing control system according to claim 11, wherein the state parameters of the target mobile platform comprise motion parameters of the target mobile platform, and the detection module comprises:
a calculating unit, configured to calculate the motion parameters of the preset landing point label relative to the unmanned aerial vehicle;
and a determining unit, configured to determine the motion parameters of the target mobile platform according to the motion parameters of the preset landing point label relative to the unmanned aerial vehicle.
19. A landing control system according to claim 11, wherein the processing module is configured to calculate relative state parameters of the unmanned aerial vehicle and the target mobile platform according to the state parameters of the unmanned aerial vehicle and the state parameters of the target mobile platform, and to plan a flight path of the unmanned aerial vehicle from the position where the unmanned aerial vehicle is located to the preset landing point label according to the relative state parameters.
20. A landing control system according to claim 11, wherein the control module is further configured to adjust the state parameters of the unmanned aerial vehicle in real time while controlling the unmanned aerial vehicle to travel along the flight path, and, when the unmanned aerial vehicle travels to within a preset height of the preset landing point label and the current state parameters of the unmanned aerial vehicle match the current state parameters of the target mobile platform, to control the unmanned aerial vehicle to land on the preset landing point label of the target mobile platform.
21. An unmanned aerial vehicle comprising a memory and a processor, wherein the memory stores computer-readable instructions which, when executed by the processor, cause the processor to perform the landing control method according to any one of claims 1 to 10 or to implement the functions of the landing control system according to any one of claims 11 to 20.
22. A nonvolatile readable storage medium storing computer-readable instructions which, when executed by a processor, cause the processor to perform the landing control method according to any one of claims 1 to 10 or to implement the functions of the landing control system according to any one of claims 11 to 20.
CN201910849942.XA 2019-09-09 2019-09-09 Landing control method, system, unmanned aerial vehicle and storage medium Pending CN110687928A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910849942.XA CN110687928A (en) 2019-09-09 2019-09-09 Landing control method, system, unmanned aerial vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN110687928A (en) 2020-01-14

Family

ID=69108023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910849942.XA Pending CN110687928A (en) 2019-09-09 2019-09-09 Landing control method, system, unmanned aerial vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN110687928A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102298389A (en) * 2011-06-10 2011-12-28 清华大学 System fully controlled and taken over by ground station during takeoff and landing stages of unmanned plane
CN109641652A (en) * 2017-02-28 2019-04-16 深圳市大疆创新科技有限公司 Unmanned plane landing control method, device and unmanned plane
CN109753079A (en) * 2017-11-03 2019-05-14 南京奇蛙智能科技有限公司 A kind of unmanned plane precisely lands in mobile platform method
CN108873930A (en) * 2018-05-31 2018-11-23 苏州市启献智能科技有限公司 Unmanned plane landing method and system based on mobile platform
CN108873917A (en) * 2018-07-05 2018-11-23 太原理工大学 A kind of unmanned plane independent landing control system and method towards mobile platform

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113448345A (en) * 2020-03-27 2021-09-28 北京三快在线科技有限公司 Unmanned aerial vehicle landing method and device
CN111813148A (en) * 2020-07-22 2020-10-23 广东工业大学 Unmanned aerial vehicle landing method, system, equipment and storage medium
CN111813148B (en) * 2020-07-22 2024-01-26 广东工业大学 Unmanned aerial vehicle landing method, system, equipment and storage medium
CN112099527A (en) * 2020-09-17 2020-12-18 湖南大学 Control method and system for autonomous landing of mobile platform of vertical take-off and landing unmanned aerial vehicle
CN112319804A (en) * 2020-11-04 2021-02-05 北京京东乾石科技有限公司 Control method and device for unmanned aerial vehicle
CN112319804B (en) * 2020-11-04 2022-06-07 北京京东乾石科技有限公司 Control method and device for unmanned aerial vehicle
CN114435614A (en) * 2020-11-05 2022-05-06 北星空间信息技术研究院(南京)有限公司 Method for dynamically landing ground robot by unmanned aerial vehicle
CN112660011A (en) * 2020-12-23 2021-04-16 海南电网有限责任公司琼海供电局 Unmanned aerial vehicle intelligent inspection operation vehicle for power transmission line
WO2022261901A1 (en) * 2021-06-17 2022-12-22 深圳市大疆创新科技有限公司 Unmanned aerial vehicle landing control method and apparatus, unmanned aerial vehicle, system, and storage medium
CN114489130A (en) * 2022-01-25 2022-05-13 中国民用航空总局第二研究所 Unmanned aerial vehicle ground scheduling equipment, method and device
CN114489130B (en) * 2022-01-25 2023-09-12 中国民用航空总局第二研究所 Unmanned aerial vehicle ground scheduling equipment, method and device
CN114545957A (en) * 2022-01-25 2022-05-27 中国舰船研究设计中心 Unmanned aerial vehicle retrieves bootstrap system
CN114935946A (en) * 2022-07-21 2022-08-23 浙江这里飞科技有限公司 Unmanned aerial vehicle landing method and device

Similar Documents

Publication Publication Date Title
CN110687928A (en) Landing control method, system, unmanned aerial vehicle and storage medium
CN108227751B (en) Landing method and system of unmanned aerial vehicle
EP3077879B1 (en) Imaging method and apparatus
JP2021513714A (en) Aircraft smart landing
JP2019532292A (en) Autonomous vehicle with vehicle location
JP2015006874A (en) Systems and methods for autonomous landing using three dimensional evidence grid
US20210109546A1 (en) Predictive landing for drone and moving vehicle
EP3077760B1 (en) Payload delivery
EP3077880B1 (en) Imaging method and apparatus
KR101421172B1 (en) Unmanned transport vehicles with shuttle robot platform
US11726501B2 (en) System and method for perceptive navigation of automated vehicles
CN111123964B (en) Unmanned aerial vehicle landing method and device and computer readable medium
CN113156998B (en) Control method of unmanned aerial vehicle flight control system
EP4042105B1 (en) Map including data for routing aerial vehicles during gnss failure
JP7190699B2 (en) Flight system and landing control method
EP4148385A1 (en) Vehicle navigation positioning method and apparatus, and base station, system and readable storage medium
CN111693052A (en) Unmanned aerial vehicle navigation method and device, unmanned aerial vehicle and storage medium
JP2017206072A (en) Flight control device and flight control method
KR102368082B1 (en) Autonomous driving control method of a Robot and System therefor
CN116430901A (en) Unmanned aerial vehicle return control method and system based on mobile parking apron
CN113758482A (en) Vehicle navigation positioning method, device, base station, system and readable storage medium
GB2522328A (en) Payload delivery
CN115334130A (en) Intelligent network operation service platform and method
JP2020052660A (en) Flying robot and monitoring system
CN114721441A (en) Multi-information-source integrated vehicle-mounted unmanned aerial vehicle autonomous landing control method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20200114)