CN113759943A - Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system

Info

Publication number
CN113759943A
Authority
CN
China
Prior art keywords
landing, unmanned aerial vehicle, landing platform, pattern
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111190817.6A
Other languages
Chinese (zh)
Inventor
刘书林
徐彬
樊伟
刘春桃
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Chongqing Innovation Center of Beijing University of Technology
Original Assignee
Beijing Institute of Technology BIT
Chongqing Innovation Center of Beijing University of Technology
Application filed by Beijing Institute of Technology BIT, Chongqing Innovation Center of Beijing University of Technology filed Critical Beijing Institute of Technology BIT
Priority to CN202111190817.6A
Publication of CN113759943A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/04Control of altitude or depth
    • G05D1/06Rate of change of altitude or depth
    • G05D1/0607Rate of change of altitude or depth specially adapted for aircraft
    • G05D1/0653Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D1/0676Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an unmanned aerial vehicle landing platform, an identification method, a landing method and a flight operation system. The landing platform carries a multi-size, nested pattern arrangement: after the unmanned aerial vehicle reaches the landing point, it first automatically identifies the main pattern, and as the distance shortens it automatically identifies the first secondary pattern nested inside it. When airflow disturbance shifts the vehicle off position, it automatically identifies one of the second secondary patterns arranged around the main pattern. The identification image therefore guides the unmanned aerial vehicle throughout the whole landing process, achieving accurate landing. The landing process is designed in segments comprising three stages, initial acceleration, uniform-speed flight and terminal deceleration, with the vehicle touching down on the landing platform exactly as its speed decreases to zero, achieving a smooth landing. The system can guide the unmanned aerial vehicle to land at a specified position accurately, smoothly and efficiently, adaptively identifies the pattern target during descent, and offers high robustness and strong anti-interference performance.

Description

Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system
Technical Field
The invention relates to the field of unmanned aerial vehicles, and in particular to an unmanned aerial vehicle landing platform, a landing platform identification method, an unmanned aerial vehicle landing method and a flight operation system.
Background
As unmanned aerial vehicle technology spreads through military and civil applications, unmanned aerial vehicles with a precision landing capability attract growing attention, for tasks such as precise landing, shipboard recovery and fixed-point delivery. Conventional guidance techniques include inertial guidance, radar guidance and high-precision satellite guidance. The position error of inertial navigation accumulates over time, degrading navigation accuracy; radar positioning accuracy is limited and the equipment is complex; high-precision satellite guidance relies on satellite signals that are easily jammed, so its reliability is hard to guarantee in the final landing phase. Visual guidance, by contrast, is based on image processing: the images acquired by a camera are processed and combined with the camera's intrinsic parameters, constraint information and so on to solve for the pose of the target during motion, and the aircraft is controlled to land accordingly.
Vision measurement falls mainly into monocular and binocular approaches. Binocular vision measurement requires two cameras to be mounted on the unmanned aerial vehicle; the installation accuracy requirements are high, the implementation is complex, and a longer measuring distance requires a longer baseline, which is difficult to install on a drone. Monocular vision measurement needs only one camera and, combined with the known size of the target pattern, can achieve relative positioning; the system is structurally simple and easy to install, a great advantage for applications with a designated landing point.
Current monocular-vision landing systems typically place only one pattern on the landing platform, and the unmanned aerial vehicle lands autonomously by locating and tracking that pattern. During descent, however, the field of view changes with the distance and position relative to the landing plate: as the drone closes in, it can lose the target near the platform, so the final touchdown position easily deviates and landing accuracy is low. Airflow disturbance, moreover, can push the drone away from the center of the landing plate, where it can only "see" a local region and cannot recover the whole pattern; the usual remedy is to pull the drone back up to a certain height and re-locate the pattern, which lengthens the landing time.
Disclosure of Invention
The invention aims to address the problems above by providing a landing platform that guides an unmanned aerial vehicle to land on it efficiently and accurately. The invention also provides a landing platform identification method, so that the landing platform can be accurately identified from any viewing angle, as well as a method and a system for guiding an unmanned aerial vehicle to land based on multi-size nested patterns, to guide and control high-precision, high-efficiency landing.
The technical scheme adopted by the invention is as follows:
the utility model provides an unmanned aerial vehicle landing platform, landing platform central authorities are provided with the identification image, the identification image includes the pattern of many sizes, nested formula design.
The identification image comprises a plurality of patterns with multiple sizes and nested design, so that the unmanned aerial vehicle always has the pattern to play a guiding role in the identification image in the landing process no matter whether the pose changes, namely, the unmanned aerial vehicle does not lose a tracking target in the landing process no matter whether the pose changes.
Further, the identification image comprises a plurality of patterns, including a main pattern, a first secondary pattern nested inside the main pattern, and a plurality of second secondary patterns arranged circumferentially around the main pattern.
During a landing approach from far to near, the unmanned aerial vehicle first sees all the patterns and selects the main pattern as its guide; as the field of view keeps shrinking until it can no longer contain the whole main pattern, only the first secondary pattern inside it can still be identified, and it serves as the guide for the rest of the descent. When airflow or other disturbances shift the drone's position, the main pattern or the first secondary pattern may no longer be identifiable; the second secondary patterns distributed around the main pattern then come into play. The drone identifies a second secondary pattern on the visible part of the landing platform and continues descending under its guidance. Since the size and position of every pattern are fixed, the spatial relation between each secondary pattern and the main pattern is known, so once the drone identifies a secondary pattern, a simple transformation yields its spatial relation to the main pattern.
A landing platform identification method comprises identifying a landing platform from a captured image, including identifying within the image an identification image composed of multi-size nested patterns.
Further, the identification image comprises a plurality of patterns, each assigned a recognition priority, so that the associated information of the identified pattern with the highest priority is taken as the identification result for the landing platform.
Further, the patterns in the identification image carry at least three levels of recognition priority: the highest priority identifies a main pattern located at the center of the identification image, the second priority identifies a first secondary pattern nested inside the main pattern, and the third priority identifies the second secondary patterns distributed circumferentially around the main pattern.
Further, each pattern is configured with a unique identifier, and a recognition priority is assigned to each identifier, thereby setting the recognition priority of every pattern in the identification image.
An unmanned aerial vehicle landing method, comprising:
collecting an image; identifying a landing platform from the acquired image by adopting the landing platform identification method; calculating relative pose data between the unmanned aerial vehicle and the detected landing platform; calculating landing navigation information according to the relative pose data; and controlling the unmanned aerial vehicle to land under the guidance of the landing navigation information.
Further, the landing navigation information covers an initial acceleration stage, a uniform speed stage and a terminal deceleration stage of the guided landing, wherein:
initial acceleration stage: accelerate from the drone's current speed to a specified speed;
uniform speed stage: maintain uniform flight at the specified speed;
terminal deceleration stage: decelerate from the specified speed to zero, touching down on the landing platform exactly as the speed reaches zero. Correspondingly, the landing navigation information contains speed control information for the three stages.
Further, the image is acquired when the drone is within a predetermined height h above the landing point, the predetermined height h being determined by:

h = f·X / X'

where f is the focal length of the camera, X' is the maximum X-axis extent of the imaging plane, and X is the maximum X-axis extent of the landing platform.
Further, the method also comprises an abnormal-condition handling step:
if no landing platform is identified from the captured images, pull the drone up in place to search for the landing platform again; if the platform is still not found after climbing to a preset height, control the drone to fly a circle of radius r centered on the landing point to search for it, and if the platform is still not found, control the drone to land in the autonomous landing mode;
if the drone's height drops below a set threshold while searching for the landing platform, control the drone to land in the autonomous landing mode.
A flight operation system comprises at least one unmanned aerial vehicle and the unmanned aerial vehicle landing platform.
Further, each unmanned aerial vehicle comprises a camera, an onboard computer and a flight control computer connected in sequence, wherein:
the camera is configured to: capture images;
the onboard computer holds recognition priorities designed for the patterns in the identification image and is configured to: perform pattern recognition on the captured image, take the associated information of the recognized pattern of highest priority as the landing platform detection result, calculate the relative pose information between the drone and the landing platform, and calculate landing navigation information from the relative pose information;
the flight control computer is configured to: control the drone to land according to the landing navigation information.
Further, the landing navigation information covers an initial acceleration stage, a uniform speed stage and a terminal deceleration stage of the guided landing, wherein:
initial acceleration stage: accelerate from the drone's current speed to a specified speed;
uniform speed stage: maintain uniform flight at the specified speed;
terminal deceleration stage: decelerate from the specified speed to zero, touching down on the landing platform exactly as the speed reaches zero.
Further, the onboard computer triggers the camera to capture an image when the drone is within a predetermined height h above the landing point, the predetermined height h being determined by:

h = f·X / X'

where f is the focal length of the camera, X' is the maximum X-axis extent of the imaging plane, and X is the maximum X-axis extent of the landing platform.
Further, an abnormal-condition handling response is configured in the onboard computer, which is configured to: if no landing platform is identified from the captured images, send an instruction to the flight control computer to pull the drone up in place and have the camera re-capture images; if the platform still cannot be identified from the re-captured images after climbing to the preset height, send an instruction to the flight control computer to fly the drone in a circle of radius r centered on the landing point while the camera re-captures images; and if the platform still cannot be identified from the re-captured images, send an instruction to the flight control computer to land the drone in the autonomous landing mode;
the onboard computer is further configured to: while executing the abnormal-condition handling response, if the drone's height falls below a set threshold, send an instruction to the flight control computer to land the drone in the autonomous landing mode.
In summary, due to the adoption of the technical scheme, the invention has the beneficial effects that:
1. The landing platform of the invention carries a content-rich, multi-size nested identification image that can guide the whole landing process, so the unmanned aerial vehicle lands at the center of the landing platform accurately and quickly.
2. When solving the relative pose between the drone and the landing platform, the pattern content is rich, the relative positions between patterns are fixed, the prior information is sufficient, and the computation is simple and reliable.
3. By setting recognition priorities in the identification image, the landing method adaptively identifies the pattern target during descent, giving the landing process higher robustness.
4. The system is structurally simple, needs only conventional devices and simple configuration, is easy to install, and has strong anti-interference capability.
5. The guidance scheme designs the landing process in segments along the real-time landing path and fully accounts for the drone's flight characteristics, making the landing more stable and reliable.
Drawings
The invention will now be described, by way of example, with reference to the accompanying drawings, in which:
FIG. 1 is a schematic view of a pinhole imaging model.
Fig. 2 is a flow chart of landing platform detection.
FIG. 3 is one embodiment of a logo image design.
Fig. 4 is a schematic diagram of a relative pose information calculation model.
FIG. 5 is a diagram of a ROS software design framework.
Detailed Description
All of the features disclosed in this specification, or all of the steps in any method or process so disclosed, may be combined in any combination, except combinations of features and/or steps that are mutually exclusive.
Any feature disclosed in this specification (including any accompanying claims, abstract) may be replaced by alternative features serving equivalent or similar purposes, unless expressly stated otherwise. That is, unless expressly stated otherwise, each feature is only an example of a generic series of equivalent or similar features.
Example one
The embodiment discloses an unmanned aerial vehicle landing method which comprises the steps of collecting images, identifying a landing platform from the collected images, calculating relative pose data between the unmanned aerial vehicle and the detected landing platform, calculating landing navigation information according to the relative pose data, and controlling the unmanned aerial vehicle to land under the guidance of the landing navigation information.
The unmanned aerial vehicle flies normally according to its planned navigation information and begins capturing downward images after flying into the effective area above the landing point or after receiving a landing instruction.
The landing point is the destination position set in the navigation information. The effective area above the landing point is a predetermined height above it; the height h is determined jointly by the camera's intrinsic parameters and the size of the landing plate. As shown in Fig. 1, h is obtained from the pinhole imaging model:

h = f·X / X'

where f is the focal length of the camera, X' is the maximum X-axis extent of the imaging plane, and X is the maximum X-axis extent of the landing platform.
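As a quick worked illustration of this relation, the sketch below computes the acquisition height from assumed camera and platform parameters; the numeric values are examples, not figures from the patent.

```python
# Sketch: image-acquisition height from the pinhole relation h = f*X/X'.
# All numbers below are assumed example values, not patent data.

def acquisition_height(f_px: float, pad_extent_m: float, img_extent_px: float) -> float:
    """Height (m) at which the landing platform just fills the image plane."""
    return f_px * pad_extent_m / img_extent_px

if __name__ == "__main__":
    f = 800.0      # camera focal length, pixels (assumed calibration value)
    X = 0.6        # maximum X-axis extent of the landing platform, meters
    X_img = 320.0  # maximum X-axis extent of the imaging plane, pixels
    print(f"begin detection below h = {acquisition_height(f, X, X_img):.2f} m")  # 1.50 m
```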
Usually, a single two-dimensional code (or other pattern) placed at the center of the landing platform lets the drone detect and identify the platform; of course the code cannot be too small, or even an image that contains the landing platform, captured from altitude, will not let the code be recognized. But during descent the drone's field of view changes with its distance and position relative to the platform: as the distance shrinks, the field of view shrinks, and airflow disturbance may push the drone away from the platform center so that only part of the platform, say a corner, remains visible. The drone then cannot localize the code, and hence cannot land precisely (the touchdown drifts to some other part of the platform), unless it climbs back up and re-identifies the code.
In this embodiment, the landing platform is specially designed: an identification image is placed at the center of the platform to guide precise landing. The identification image is the image the drone visually recognizes to determine the landing position; it comprises patterns of multiple sizes in a nested design. The patterns are the objects the drone recognizes: by locating a pattern, the drone locates the landing platform and obtains the pose relation between itself and the platform.
The identification image is designed with a plurality of patterns; one may say the patterns together constitute the identification image. The patterns are not all the same size: some are nested to accommodate the changing field of view during descent, and some are distributed around the others so that the landing platform can still be captured when the drone drifts sideways.
In some embodiments, the plurality of patterns comprises a main pattern, a first secondary pattern and several second secondary patterns: the main pattern sits at the center of the landing platform, the first secondary pattern is nested at the center of the main pattern, and the second secondary patterns are arranged circumferentially, preferably uniformly, around the main pattern. Note that the second secondary patterns are not limited to a single ring around the main pattern: several layers of second secondary patterns may be designed around it. For example, one layer of second secondary patterns may sit immediately around the main image with a further layer outside it, and so on; the number of second secondary patterns per layer is not limited and may be one or more.
In some embodiments there are four second secondary patterns, placed above, below, left and right of the main pattern; alternatively, the four second secondary patterns are placed to the upper left, lower left, upper right and lower right of the main pattern. The four second secondary patterns typically lie on the same circumference, though this does not exclude arrangements where they do not.
The specific form of each pattern can vary, provided it meets the basic requirements of computer vision recognition. In some embodiments the patterns take the form of two-dimensional codes, so the identification image comprises a main two-dimensional code, a first secondary two-dimensional code and several second secondary two-dimensional codes. Two-dimensional-code patterns are used in the embodiments below, but other patterns can of course serve as the identification image and are designed analogously. In one embodiment, the first secondary two-dimensional code is nested inside the main two-dimensional code, several second secondary two-dimensional codes (preferably 4 or more, uniformly arranged along the circumference) are designed around the main code, and the main code is much larger than the first and second secondary codes. As shown in Fig. 3, in one embodiment a second two-dimensional code 2 (the first secondary code) is nested at the center of a first two-dimensional code 1 (the main code), and 4 second secondary codes, namely a third two-dimensional code 3, a fourth two-dimensional code 4, a fifth two-dimensional code 5 and a sixth two-dimensional code 6, are placed on the four corners of the first two-dimensional code 1.
The design method of the landing platform comprises designing the identification image on the unmanned aerial vehicle landing platform. Specifically, an identification image is arranged at the center of the landing platform and comprises a plurality of patterns: a main pattern at the platform center, a first secondary pattern nested at the center of the main pattern, and second secondary patterns arranged circumferentially around the main pattern. As before, the second secondary patterns may occupy one layer or several, distributed in the up/down/left/right directions of the main pattern or in its upper-left, lower-left, upper-right and lower-right regions.
1. Step of identifying landing platform from captured image
The landing platform identification method detects the identification image on the landing platform within the captured image. As stated above, the identification image comprises multi-size, nested patterns, so the method includes identifying that multi-size nested identification image in the captured image.
Within a single frame, the image captured by the drone may contain several two-dimensional codes; for example, when the drone is far enough away, all of the codes are visible, yet the code guiding the drone at any moment should be unique. To make the drone automatically select the main code for recognition rather than the first or second secondary codes, recognition priorities are set: the main code has the highest priority, the first secondary code the second highest, and the second secondary codes the lowest. Among all codes identified in a frame, the associated information of the highest-priority one is taken as the identification result, i.e., the drone is guided to land by the identified pattern of highest priority. The associated information means the code's position, size and other attributes. When the drone is far enough from the landing platform, it automatically selects and identifies the main code; as the distance shrinks and the field of view narrows, it automatically selects the first secondary code; and when its position drifts so that only part of the platform is visible, it automatically selects a second secondary code.
Each two-dimensional code in the identification image is unique, so that the location captured by the drone can be determined. In some embodiments, this uniqueness comes from assigning a one-to-one identifier to the main code, the first secondary code and each second secondary code; assigning a priority to each code's identifier then realizes the recognition-priority scheme.
As for the identifier design, in some embodiments each code is regarded as an array of color blocks; the value of each block (black/white) is read, for example black as 1 and white as 0, and the sequentially read block values form a string used as the code's ID number (i.e., its identifier). The code whose ID has the highest priority in the frame is taken as the recognition target of that detection.
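A minimal sketch of this decode-and-prioritize step follows; the 4x4 block grid, the ID strings and the priority table are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Sketch: read a code's color blocks (black -> '1', white -> '0') into an ID
# string and keep the highest-priority detection in the frame. The grid size,
# IDs and priority ranks below are assumed examples.

PRIORITY = {
    "1111000011110000": 0,  # main code (assumed ID; rank 0 = highest priority)
    "1010101010101010": 1,  # first secondary code (assumed ID)
    "1100110011001100": 2,  # one second secondary code (assumed ID)
}

def decode_id(cells: np.ndarray) -> str:
    """cells: 2-D array of mean gray values, one entry per color block."""
    return "".join("1" if v < 128 else "0" for v in cells.flatten())

def pick_target(detections):
    """detections: list of (cells, corners). Return the known code with the
    highest priority in the frame, or None if nothing matches."""
    ranked = [(PRIORITY[decode_id(c)], i, c, corners)
              for i, (c, corners) in enumerate(detections)
              if decode_id(c) in PRIORITY]
    return min(ranked, default=None)

# Two codes visible in one frame: the main code wins.
main = np.where(np.array([[1,1,1,1],[0,0,0,0],[1,1,1,1],[0,0,0,0]]) == 1, 0, 255)
sec = np.where(np.array([[1,0,1,0]]*4) == 1, 0, 255)
best = pick_target([(sec, "sec_corners"), (main, "main_corners")])
print(best[0], best[3])  # -> 0 main_corners
```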
2. Calculating relative pose data between the unmanned aerial vehicle and the landing platform according to the detected landing platform
Given the side length of each two-dimensional code, the positional relation between each non-main code and the main code follows directly and can be configured as a known parameter. When the drone identifies a non-main code, that relation determines the code's position on the landing platform. Continuing the example with four peripheral second secondary codes: the main code is designed at the center of the landing platform, its center is taken as the origin of the coordinate system, and the corner-point coordinates of the secondary codes fall into the four quadrants of that system. Taking the third two-dimensional code 3 as an example, the coordinates of its upper-left corner satisfy the following relation in the coordinate system with the landing platform center as origin, and the corners of the fourth to sixth codes are obtained in the same way:
(corner-point coordinate relation, given in the original as an equation image)

where l is the side length of each of the four outer two-dimensional codes and l0 is the side length of the large central two-dimensional code 1.
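Because that relation survives only as an equation image, the sketch below shows one assumed way to encode the pad geometry as known parameters: each corner code's center offset from the main code's center, from which its corner coordinates follow. The placement (corner codes centered on the corners of code 1) and all dimensions are illustrative assumptions.

```python
# Sketch: landing-pad geometry as known parameters. L0 is the side length of
# the large central code 1, L that of the four corner codes; the placement
# below (corner codes centered on code 1's corners) is an assumption.

L0 = 0.80  # side length of main code 1, meters (assumed)
L = 0.15   # side length of each corner code, meters (assumed)

CENTER_OFFSET = {           # code-center offsets from the pad origin (x right, y up)
    3: (-L0 / 2, +L0 / 2),  # upper-left corner of code 1
    4: (+L0 / 2, +L0 / 2),  # upper-right
    5: (-L0 / 2, -L0 / 2),  # lower-left
    6: (+L0 / 2, -L0 / 2),  # lower-right
}

def corner_points(code_id: int):
    """3-D corners of a corner code in the pad frame (z = 0), UL/UR/LR/LL."""
    cx, cy = CENTER_OFFSET[code_id]
    h = L / 2
    return [(cx - h, cy + h, 0.0), (cx + h, cy + h, 0.0),
            (cx + h, cy - h, 0.0), (cx - h, cy - h, 0.0)]

print(corner_points(3)[0])  # upper-left corner point of code 3
```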
The pinhole camera model shown in Fig. 1 also illustrates the imaging principle of the camera. From this principle and related theory, the correspondence between the three-dimensional coordinates of a point in the space of the landing platform and the pixel coordinates in the captured image is obtained. Taking a corner point (x, y, z) of a second secondary code as an example, the point is transformed into the camera coordinate system by the extrinsic matrix [R|t] and then projected onto the image plane through the projection model (camera intrinsic matrix K):

s·(u, v, 1)^T = K·[R|t]·(x, y, z, 1)^T

where (x, y, z) are the corner-point coordinates of the second secondary code, (u, v) are the pixel coordinates of that corner in the image, R is the rotation of the attitude, t the translation, and s a scale factor. As stated above, the corner coordinates are known, the intrinsic matrix K is obtained by calibrating the camera, and the pixel coordinates (u, v) are measured, so a P4P equation can be constructed and solved for the relative position and attitude of the camera with respect to the ground marker, as shown in Fig. 4, yielding the pose [R|t] of the camera (i.e., the drone) relative to the landing plate.
It is known that Pc = R·Po + t, where Pc is a point in the camera coordinate system, Po the same point in the world coordinate system, and R, t relate the world and camera coordinate systems. When Po = (0; 0; 0), Pc = t; that is, t is the coordinate of the world origin (the center of the two-dimensional code) in the camera coordinate system.
In actual flight, to reduce interference from vibration and similar factors, the camera is mounted at a fixed angle rigidly attached to the quadrotor, so the pose of the quadrotor relative to the ground marker is obtained from the camera pose by a coordinate transformation using the camera's coordinates in the body frame.
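One standard way to carry out this P4P solution is OpenCV's PnP solver, sketched below; the intrinsic matrix, corner coordinates and pixel measurements are placeholder values, and solvePnP stands in for whatever solver the onboard computer actually runs.

```python
import cv2
import numpy as np

# Sketch: recover [R|t] of the camera relative to the pad from four known
# 3-D corner points and their measured pixels (the P4P problem above).
# K and all coordinates below are assumed example values.

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])  # intrinsic matrix from camera calibration
dist = np.zeros(5)              # assume negligible lens distortion

object_pts = np.array([[-0.4, 0.4, 0.0],   # corner points in the pad frame (m)
                       [0.4, 0.4, 0.0],
                       [0.4, -0.4, 0.0],
                       [-0.4, -0.4, 0.0]])
image_pts = np.array([[210.0, 130.0],      # measured pixel coordinates (u, v)
                      [430.0, 132.0],
                      [428.0, 352.0],
                      [212.0, 350.0]])

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)                 # rotation matrix from the Rodrigues vector
# Pc = R @ Po + t: the pad origin (code center) sits at t in the camera frame,
# so the camera position expressed in the pad frame is -R^T @ t.
print("camera position in pad frame:", -R.T @ tvec.reshape(3))
```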
3. Calculating landing navigation information according to the relative pose data
To keep the flight smooth, once the drone reaches the landing point and begins to land, the descent is divided into three stages. Initial acceleration: accelerate from the drone's current speed vc to a specified speed v0. Uniform speed: hold v0 in steady flight. Terminal deceleration: decelerate from v0 so that the speed reaches 0 exactly at touchdown on the landing platform.
The terminal deceleration stage should satisfy:

s1 = v0·t1 - (1/2)·a·t1², with v0 = a·t1

The uniform speed stage satisfies:

s2 = v0·t2

The initial acceleration stage satisfies:

s3 = vc·t3 + (1/2)·a·t3², with v0 = vc + a·t3

Overall, the following must hold:

s = s1 + s2 + s3
v0 ≤ vmax
a ≤ amax
where amax is the drone's maximum acceleration, vmax its maximum speed, v0 the uniform flight speed, vc the speed at the start of the landing, a the acceleration used when changing speed, t1, t2, t3 the durations of the terminal deceleration, uniform speed and initial acceleration stages, s1, s2, s3 the corresponding flight distances, and s the total flight distance of the landing.
From these relations, the drone's desired speed at each position along the landing path can be calculated, forming the landing navigation information vd = (vdx, vdy, vdz), the desired velocity along the three axes, where v0, vc, a, s, amax, vmax are pre-designed known quantities.
If the landing target is a moving object with velocity vt, the final landing navigation command is:

vcmd = vd + vt
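The sketch below turns these relations into a desired-speed command as a function of the remaining distance to the pad, with the moving-platform velocity added as a feed-forward term; the speed limits, acceleration and control step are assumed design values.

```python
import math

# Sketch: desired approach speed from the three-stage profile above, given the
# remaining distance s_rem. v0, a and the 0.1 s control step are assumed values.

def desired_speed(s_rem: float, v_now: float, v0: float = 2.0, a: float = 1.0) -> float:
    """Speed command (m/s) chosen so v reaches 0 exactly when s_rem reaches 0."""
    v_brake = math.sqrt(max(0.0, 2.0 * a * s_rem))  # terminal-deceleration cap
    if v_brake <= v_now:
        return v_brake                              # inside braking distance
    return min(v0, v_now + a * 0.1, v_brake)        # accelerate / cruise

def velocity_command(direction, s_rem, v_now, platform_vel=(0.0, 0.0, 0.0)):
    """3-axis command: profile speed along the unit vector `direction`,
    plus the platform-velocity feed-forward for a moving target."""
    v = desired_speed(s_rem, v_now)
    return tuple(v * d + pv for d, pv in zip(direction, platform_vel))

# 5 m to go, currently 1.2 m/s, platform drifting at 0.3 m/s along x:
print(velocity_command((0.0, 0.0, -1.0), 5.0, 1.2, platform_vel=(0.3, 0.0, 0.0)))
```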
4. Controlling the unmanned aerial vehicle to land under the guidance of the landing navigation information
After the navigation command is calculated, it is sent to the flight control computer over the serial port as the expected value for speed control; the flight control computer tracks this speed, regulating the landing speed at each position to the expected value indicated by the landing navigation command.
This embodiment also designs a response step for abnormal conditions.
5. Abnormal situation handling
If the target (landing platform) is lost during descent, the drone first pulls up in place and searches for the target again, which is the conventional response. If the target is still not reacquired, the drone flies a circle of radius r centered on its task start point (the landing point) to search for it. If the target is still not found, the drone is forced into the autonomous landing mode. In addition, if the drone's altitude becomes too low, below the safe-landing threshold, while searching for the target, it is likewise forced into the autonomous landing mode. The autonomous landing mode is a uniform vertical descent, for example at 0.1 m/s; that is, the three-stage landing is not performed.
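One way to organize this fallback logic is sketched below; the altitude thresholds, search radius and the `ctrl` interface are hypothetical, standing in for the drone's actual control stack.

```python
# Sketch of the abnormal-condition handling described above. The thresholds,
# the radius r and the `ctrl` facade are assumed, not taken from the patent.

PULL_UP_CEILING = 8.0  # m: stop climbing and start circling here (assumed)
SAFE_ALT_FLOOR = 0.5   # m: below this, force the autonomous landing (assumed)
SEARCH_RADIUS_R = 3.0  # m: circular search radius r (assumed)

def handle_target_loss(ctrl):
    """ctrl: hypothetical facade exposing altitude(), climb_step(),
    circle(radius), detect_pad() and autonomous_land()."""
    if ctrl.altitude() < SAFE_ALT_FLOOR:        # too low to keep searching
        return ctrl.autonomous_land()
    # 1. Pull up in place and try to reacquire the pad.
    while ctrl.altitude() < PULL_UP_CEILING:
        ctrl.climb_step()
        if ctrl.detect_pad():
            return "reacquired"
    # 2. Circle the landing point with radius r, still watching for the pad.
    if ctrl.circle(SEARCH_RADIUS_R) and ctrl.detect_pad():
        return "reacquired"
    # 3. Still lost: force the autonomous constant-speed landing mode.
    return ctrl.autonomous_land()
```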
Example two
This embodiment discloses another method for guiding a drone to land based on multi-size nested patterns. It is essentially the same as the method of embodiment one; the only difference is the design of each code's unique identifier: instead of encoding it with color blocks, each two-dimensional code is configured with a link address that points to its unique identifier.
Example three
The unmanned aerial vehicle carries a camera, a radar, an IMU (inertial measurement unit), a GPS unit, a positioning and navigation unit, an onboard computer, a flight control computer and so on. The onboard computer is an NVIDIA NX; it communicates with the flight control computer over a serial link via mavros using the general micro-air-vehicle communication protocol MAVLink, sending flight commands (such as takeoff navigation information, planned navigation information and landing navigation information) to the flight control computer to realize autonomous flight and landing.
The camera, onboard computer and flight control computer on the drone, together with the landing platform bearing the multi-size nested patterns, form a system for guiding the drone to land.
The drone flies normally according to its planned navigation information; during normal flight the camera is off and the flight is guided by real-time kinematic (RTK) positioning. After the drone flies into the effective area above the landing point or receives a landing instruction, the camera is switched on, captures images of the area below, and passes them to the onboard computer for target detection.
The effective area above the landing point is a predetermined height above it; the height h is determined jointly by the camera's intrinsic parameters and the size of the landing plate. As shown in Fig. 1, h is obtained from the pinhole imaging model:

h = f·X / X'

where f is the focal length of the camera, X' is the maximum X-axis extent of the imaging plane, and X is the maximum X-axis extent of the landing platform.
The onboard computer identifies the landing platform in the captured images. As stated earlier, the landing platform is marked by multi-size, nested two-dimensional codes, so detecting a code means the platform has been identified. During descent the platform's position must be determined repeatedly from successive images so it can be tracked. Fig. 2 shows the detection flow: the captured images form a video stream; the platform is detected in the current frame, and if it is not found, detection continues on the next frame. Once the platform is detected, it is tracked in subsequent frames; when the target is lost (tracking fails), i.e., the platform is not found in some frame, detection restarts from the following frame, and the cycle repeats. For each successfully tracked frame, the code in the image is localized and identified, a position solution is computed to obtain height data, and the process then repeats to compute fresh height data.
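The detect-then-track cycle of Fig. 2 can be sketched as the loop below; detect_pad and track_pad are stubs standing in for the code detector and tracker, whose internals the patent does not fix at this level.

```python
import cv2

# Sketch of the Fig. 2 flow: detect the pad in the current frame; once found,
# track it frame to frame; on tracking loss, fall back to detection.

def detect_pad(frame):
    """Return a bounding box (x, y, w, h) or None. Stub: the real system runs
    code detection with priority-based target selection here."""
    return None

def track_pad(frame, box):
    """Return the updated box, or None on tracking failure. Stub."""
    return box

def landing_pad_stream(video_source=0):
    cap = cv2.VideoCapture(video_source)
    box = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        box = detect_pad(frame) if box is None else track_pad(frame, box)
        if box is not None:
            yield frame, box  # downstream: localize code, solve pose, update height
    cap.release()
```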
When the drone is far enough from the landing platform, it automatically selects and identifies the main code; as the distance shrinks and the field of view narrows, it automatically selects the first secondary code; and when its position drifts so that only part of the platform is visible, it automatically selects a second secondary code. This automatic selection within a frame is achieved by setting recognition priorities in the onboard computer: the main code highest, the first secondary code second highest, the second secondary codes lowest. In some embodiments, each code's uniqueness comes from a one-to-one identifier assigned to the main, first secondary and second secondary codes, and prioritizing the identifiers realizes the recognition priorities. In one embodiment each code is linked to its identifier. In another, each code is regarded as an array of color blocks whose values (black 1, white 0) are read in sequence to form a string used as the code's ID number (identifier). After reading an ID from the captured image, the onboard computer matches it and takes the code with the highest-priority ID in the picture as the recognition target.
The airborne computer calculates the relative position and orientation data between the unmanned aerial vehicle and the landing platform according to the detected landing platform:
if the side length of each two-dimensional code is known, the position relationship between each non-main two-dimensional code and the main two-dimensional code can be simply deduced and can be configured as a known parameter. When unmanned aerial vehicle discernment non-main two-dimensional code, according to the position relation between each non-main two-dimensional code and the main two-dimensional code, can determine the position of this non-main two-dimensional code on descending platform. Still taking the example of designing four peripheral secondary two-dimensional codes in the above embodiment as an example, the main two-dimensional code is designed in the center of the landing platform, the center of the main two-dimensional code is taken as the origin of the coordinate system, and the coordinates of the corner point coordinates of each secondary two-dimensional code in the coordinate system are respectively distributed in four quadrants. Taking the third two-dimensional code 3 as an example, the corner point coordinates of the upper left corner of the third two-dimensional code 3 satisfy the following relation in a coordinate system with the landing platform as the origin, and the corner point coordinates of the other fourth to sixth two-dimensional codes are obtained by the same principle:
Figure BDA0003301102520000161
wherein: ll0- -side length of two-dimensional code 1 with four large outer periphery in middle and outer
The pinhole camera model shown in Fig. 1 also illustrates the imaging principle of the camera. From this principle and related theory, the correspondence between the three-dimensional coordinates of a point in the space of the landing platform and the pixel coordinates in the captured image is obtained. Taking a corner point (x, y, z) of a second secondary code as an example, the point is transformed into the camera coordinate system by the extrinsic matrix [R|t] and then projected onto the image plane through the projection model (camera intrinsic matrix K):

s·(u, v, 1)^T = K·[R|t]·(x, y, z, 1)^T

where (x, y, z) are the corner-point coordinates of the second secondary code, (u, v) are the pixel coordinates of that corner in the image, R is the rotation of the attitude, t the translation, and s a scale factor. As stated above, the corner coordinates are known, the intrinsic matrix K is obtained by calibrating the camera, and the pixel coordinates (u, v) are measured, so a P4P equation can be constructed and solved for the relative position and attitude of the camera with respect to the ground marker, as shown in Fig. 4, yielding the pose [R|t] of the camera (i.e., the drone) relative to the landing plate.
It is known that Pc = R·Po + t, where Pc is a point in the camera coordinate system, Po the same point in the world coordinate system, and R, t relate the world and camera coordinate systems. When Po = (0; 0; 0), Pc = t; that is, t is the coordinate of the world origin (the center of the two-dimensional code) in the camera coordinate system.
In actual flight, to reduce interference from vibration and similar factors, the camera is mounted at a fixed angle rigidly attached to the quadrotor, so the pose of the quadrotor relative to the ground marker is obtained from the camera pose by a coordinate transformation using the camera's coordinates in the body frame.
The onboard computer also calculates landing navigation information from the relative pose data.
To keep the flight smooth, once the drone reaches the landing point and begins to land, the descent is divided into three stages. Initial acceleration: accelerate from the drone's current speed vc to a specified speed v0. Uniform speed: hold v0 in steady flight. Terminal deceleration: decelerate from v0 so that the speed reaches 0 exactly at touchdown on the landing platform.
The terminal deceleration stage should satisfy:

s1 = v0·t1 - (1/2)·a·t1², with v0 = a·t1

The uniform speed stage satisfies:

s2 = v0·t2

The initial acceleration stage satisfies:

s3 = vc·t3 + (1/2)·a·t3², with v0 = vc + a·t3

Overall, the following must hold:

s = s1 + s2 + s3
v0 ≤ vmax
a ≤ amax
where amax is the drone's maximum acceleration, vmax its maximum speed, v0 the uniform flight speed, vc the speed at the start of the landing, a the acceleration used when changing speed, t1, t2, t3 the durations of the terminal deceleration, uniform speed and initial acceleration stages, s1, s2, s3 the corresponding flight distances, and s the total flight distance of the landing.
From these relations, the drone's desired speed at each position along the landing path can be calculated, forming the landing navigation information vd = (vdx, vdy, vdz), the desired velocity along the three axes, where v0, vc, a, s, amax, vmax are pre-designed known quantities.
If the landing target is a moving object with velocity vt, the final landing navigation command is:

vcmd = vd + vt
The flight control computer controls the drone's landing according to the landing navigation information calculated by the onboard computer.
Low-level control of the drone is executed by the flight control computer (PX4). After the onboard computer calculates the navigation command, it sends the command over the serial port to the flight control computer as the expected value for speed control; the flight control computer tracks this speed, regulating the landing speed at each position to the expected value indicated by the landing navigation command.
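On a ROS/mavros stack like the one described, the speed expectation can be streamed to PX4 as below; the topic and message type are the standard mavros velocity-setpoint interface, while the 20 Hz rate and the fixed descent command are simplified assumptions.

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import TwistStamped

# Sketch: publish the desired landing velocity to PX4 through mavros.
# The constant command below stands in for the output of the three-stage
# profile computed by the onboard computer.

def publish_landing_velocity():
    rospy.init_node("landing_velocity_commander")
    pub = rospy.Publisher("/mavros/setpoint_velocity/cmd_vel",
                          TwistStamped, queue_size=1)
    rate = rospy.Rate(20)  # offboard control expects a steady setpoint stream
    cmd = TwistStamped()
    while not rospy.is_shutdown():
        cmd.header.stamp = rospy.Time.now()
        cmd.twist.linear.z = -0.5  # descend at 0.5 m/s (placeholder value)
        pub.publish(cmd)
        rate.sleep()

if __name__ == "__main__":
    publish_landing_velocity()
```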
Further, an emergency handling program is configured on the onboard computer; when it runs, it sends instructions to the flight control computer.
If the target (landing platform) is lost during descent, the flight control computer, on instruction, pulls the drone up in place to search for the target again, which is the conventional response. If the target is still not reacquired, the flight control computer, on instruction, flies the drone in a circle of radius r centered on the task start point to search for it. If the target is still not found, the onboard computer forces the flight control computer into the autonomous landing mode. In addition, if the drone's altitude becomes too low, below the safe-landing threshold, while searching for the target, the onboard computer likewise forces the flight control computer into the autonomous landing mode. The autonomous landing mode is a uniform vertical descent, for example at 0.1 m/s; that is, the three-stage landing is not performed.
Example four
This embodiment discloses a system, carried on the drone, for guiding precise landing based on two-dimensional codes. The two-dimensional codes of this embodiment may be replaced by other computer-readable patterns.
As before, the drone carries a camera, a radar, an IMU (inertial measurement unit), a GPS unit, a positioning and navigation unit, an onboard computer, a flight control computer and so on; the onboard computer is an NVIDIA NX, communicates with the flight control computer over a serial link via mavros using the MAVLink protocol, and sends flight commands (takeoff navigation information, planned navigation information, landing navigation information and the like) to realize autonomous flight and landing.
The system further comprises an identification image formed by several two-dimensional codes on the landing platform. Unlike a conventional single code, the identification image here comprises multiple multi-size nested codes: a first secondary code nested inside a main code, and several second secondary codes designed circumferentially around the main code, preferably 4 or more uniformly distributed along the circumference. The main code is much larger than the first and second secondary codes. As shown in Fig. 3, in one embodiment a first two-dimensional code 1 serves as the main code, a second two-dimensional code 2 at its center serves as the first secondary code, and 4 second secondary codes, namely the third code 3, the fourth code 4, the fifth code 5 and the sixth code 6, are placed on the four corners of code 1.
When the drone is far enough from the landing platform, all the codes can be captured; as the distance shrinks and the field of view narrows, only the first secondary code remains visible; and when the drone's position drifts, it can only "see" some second secondary code on part of the landing platform.
The onboard computer comprises a target recognition part, a pose calculation part and a navigation planning part. Specifically:
the target recognition section recognizes the landing platform from the image captured by the camera.
In the foregoing, the landing platform is provided with the two-dimensional code image, and the two-dimensional code image includes the main two-dimensional code, the first secondary two-dimensional code, and the second secondary two-dimensional code. In the target recognition unit, recognition priorities are previously assigned to the respective two-dimensional codes, wherein the primary two-dimensional code is set to have the highest priority, the secondary two-dimensional code of the first secondary two-dimensional code, and the secondary two-dimensional code of the second secondary two-dimensional code has the lowest priority. In some embodiments, the uniqueness of each two-dimensional code is determined by configuring a one-to-one corresponding identifier for each primary two-dimensional code, and each secondary two-dimensional code, and the priority of the identifier of each two-dimensional code is set, that is, the setting of the identification priority of each two-dimensional code is realized. In one embodiment, each two-dimensional code is considered to be formed by arranging a plurality of color block arrays, the identification result of black is set to be 1 and white is set to be 0 in each color block (black/white), when the two-dimensional code is identified, the color block values identified in sequence (such as horizontal identification sequence) form a character string as an ID number (i.e. identification) of the corresponding two-dimensional code, and the identification priority of each two-dimensional code is set by setting priority to the ID number.
Within one frame, the target recognition part takes the recognized two-dimensional code of highest priority as the recognition target: as long as the main two-dimensional code appears in the image, it is the target and the other codes are ignored; otherwise the first secondary two-dimensional code, or failing that a second secondary two-dimensional code, becomes the target. Concretely, once the identifiers of the detected codes are read, the identifier of highest priority is looked up in the configured priority table and its code is taken as the target. In the design above, for example, when several two-dimensional codes are found in the same frame, their ID numbers are obtained from the color block values, their priorities are obtained by matching the ID numbers against the priority table, and the code with the highest priority becomes the recognition target.
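As a hedged sketch of this ID-and-priority scheme (the ID strings and the priority table below are invented placeholders, not values from the patent), the decoding and target selection could look like:

```python
# Illustrative sketch: decode each detected code's ID from its black/white
# block values, then keep only the highest-priority code in the frame.

def blocks_to_id(blocks):
    """blocks: 2D list of 0/1 cell values (black -> 1, white -> 0),
    read row by row into an ID string."""
    return "".join(str(bit) for row in blocks for bit in row)

# Lower number = higher priority. The 6-bit ID strings are placeholders.
PRIORITY = {
    "110010": 0,                       # main code
    "101001": 1,                       # first secondary code
    "011100": 2, "010110": 2,
    "001101": 2, "100011": 2,          # the four second secondary codes
}

def pick_target(detections):
    """detections: list of (id_string, corners); returns the detection
    whose ID has the highest configured priority, or None."""
    known = [d for d in detections if d[0] in PRIORITY]
    return min(known, key=lambda d: PRIORITY[d[0]], default=None)
```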
The pose calculation part calculates relative pose data between the unmanned aerial vehicle and the landing platform recognized by the target recognition part.
As stated in the foregoing embodiment, the position of every non-main two-dimensional code relative to the main two-dimensional code is known. The main two-dimensional code is placed at the center of the landing platform (it need not be; an offset has no effect on the algorithm), and with the center of the landing platform as the origin of the coordinate system, the coordinates of every non-main two-dimensional code are likewise known quantities.
According to the pinhole imaging model shown in fig. 1, a correspondence exists between the three-dimensional coordinates of a point on the landing platform and its pixel coordinates in the captured image. Taking a corner point (x, y, z) of a second secondary two-dimensional code as an example, the point is transformed into the camera coordinate system by the extrinsic matrix [R|t] and then projected onto the image plane by the projection model (the camera intrinsic matrix K). As noted above, the corner coordinates are known quantities, the intrinsic matrix K is obtained by calibrating the camera, and the pixel coordinates (u, v) of the corners in the image are measured; the P4P equations can therefore be constructed and solved to yield the relative position and attitude of the camera with respect to the ground marker, as shown in fig. 4, i.e. the pose [R|t] of the camera (and hence the unmanned aerial vehicle) relative to the landing plate.
s · [u, v, 1]^T = K [R | t] [x, y, z, 1]^T

wherein: (x, y, z) are the coordinates of a corner point of the second secondary two-dimensional code, (u, v) are the pixel coordinates of that corner in the image, s is the projective scale factor, R is the rotation of the attitude, and t is the translation.
It is known that Pc = R·Po + t, where Pc is a point in the camera coordinate system, Po is the same point in the world coordinate system, and R, t are the relative rotation and translation between the world and camera coordinate systems. When Po = (0; 0; 0), Pc = t; that is, the coordinate of the world origin (the center of the two-dimensional code) in the camera coordinate system is exactly t.
In actual flight, to reduce interference from vibration and similar factors, the camera is mounted at a fixed angle and rigidly attached to the quadrotor, so the relative position and attitude of the quadrotor with respect to the ground marker can be obtained from the camera's pose through a coordinate transform into the body coordinate system.
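For illustration, the P4P solution and the camera-to-body transform described above can be sketched with OpenCV's solvePnP; the marker size, intrinsic matrix, pixel measurements and mounting rotation below are assumed values, not the patent's calibration data:

```python
# Illustrative sketch (assumed values): solve the P4P problem for one
# marker with OpenCV, then express the pad's position in the body frame.
import numpy as np
import cv2

# 3D corners of one marker in the pad frame (z = 0 plane), for an
# assumed side length of 0.10 m centred on the pad origin.
side = 0.10
object_pts = np.array([[-side/2,  side/2, 0.],
                       [ side/2,  side/2, 0.],
                       [ side/2, -side/2, 0.],
                       [-side/2, -side/2, 0.]])

# Pixel coordinates of the same corners from the detector (example values).
image_pts = np.array([[300., 200.], [340., 202.], [338., 242.], [298., 240.]])

K = np.array([[600., 0., 320.],    # assumed intrinsics from calibration
              [0., 600., 240.],
              [0., 0., 1.]])
dist = np.zeros(5)                 # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)         # pad-frame -> camera-frame rotation

# With Po = 0, the pad origin in the camera frame is simply t.
pad_in_cam = tvec.ravel()

# The camera is rigidly mounted; an assumed fixed camera-to-body rotation
# (here: camera looks straight down, camera z maps to body -z).
R_cam_to_body = np.array([[1., 0., 0.],
                          [0., -1., 0.],
                          [0., 0., -1.]])
pad_in_body = R_cam_to_body @ pad_in_cam
print(pad_in_body)
```

With Po = 0 the translation tvec is exactly the pad origin expressed in the camera frame, which the fixed mounting rotation then carries into the body frame, as described above.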
The navigation planning part calculates landing navigation information based on the relative pose data obtained by the pose calculation part.
To make the flight smooth, the landing is divided into three stages once the unmanned aerial vehicle reaches the landing point and begins to descend. Initial acceleration: the drone accelerates from its current speed vc to a specified speed v0. Uniform-speed stage: the drone holds v0 and flies at constant speed. Terminal deceleration: the drone decelerates from v0, reaching zero speed exactly as it touches the landing platform.
The terminal deceleration stage should satisfy:

s1 = v0² / (2a),  t1 = v0 / a

the uniform-speed stage satisfies:

s2 = v0 · t2

and the initial acceleration stage satisfies:

s3 = (v0² − vc²) / (2a),  t3 = (v0 − vc) / a

The process as a whole should satisfy:

s = s1 + s2 + s3

v0 ≤ vmax

a ≤ amax

wherein: amax is the maximum acceleration of the unmanned aerial vehicle, vmax its maximum speed, v0 the speed of the uniform-speed stage, vc the speed at the moment the unmanned aerial vehicle begins to land, a the acceleration used when changing speed, t1, t2 and t3 the durations of the terminal deceleration, uniform-speed and initial acceleration stages respectively, s1, s2 and s3 the flight distances of those three stages, and s the total flight distance of the unmanned aerial vehicle.
Through these relations, the expected speed of the unmanned aerial vehicle at every position along the landing path can be calculated, forming the landing navigation command

v^d = (vx^d, vy^d, vz^d)

the three axes of desired velocity, where v0, vc, a, s, amax and vmax are pre-designed known quantities.
If the landing target is a moving object with velocity

v^t = (vx^t, vy^t, vz^t)

the final landing navigation command adds this velocity as feedforward:

v^cmd = (vx^d + vx^t, vy^d + vy^t, vz^d + vz^t)
The airborne computer sends the calculated landing navigation information to the flight control computer through a serial port, and the flight control computer regulates the landing speed at each position according to the expected velocity given by the landing navigation command.
EXAMPLE five
The invention also discloses an implementation of the first embodiment in ROS software.
As shown in fig. 5, the usb_cam node acquires image information from the camera and publishes the topic /usb_cam/image_raw. The visual recognition algorithm subscribes to /usb_cam/image_raw, performs detection and recognition, calculates the pose of the landing platform relative to the unmanned aerial vehicle, and publishes the detection result on /object_detection/landload_det. The autonomous landing algorithm subscribes to /object_detection/landload_det, performs the coordinate conversion, calculates the landing navigation information, and publishes the control command topic /control_command, which is forwarded to the flight control computer through mav_ros; the flight control computer then completes the tracking control.
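A minimal rospy sketch of this node graph follows; the topic names are those given above, while the message types, constants and the simplified in-line speed profile are assumptions for illustration:

```python
#!/usr/bin/env python
# Illustrative sketch of the autonomous-landing node wiring (assumed
# message types; the real node would also apply the camera-to-body
# coordinate transform before computing the command).
import math
import rospy
from geometry_msgs.msg import PoseStamped, TwistStamped

V0, A = 1.0, 0.5   # assumed cruise speed (m/s) and deceleration (m/s^2)

class AutonomousLanding(object):
    def __init__(self):
        rospy.init_node("autonomous_landing")
        self.cmd_pub = rospy.Publisher("/control_command", TwistStamped,
                                       queue_size=1)
        rospy.Subscriber("/object_detection/landload_det", PoseStamped,
                         self.on_detection, queue_size=1)

    def on_detection(self, det):
        # Pad position relative to the drone.
        p = det.pose.position
        dist = math.sqrt(p.x**2 + p.y**2 + p.z**2)
        # Decelerating profile: speed reaches zero exactly at the pad.
        speed = min(V0, math.sqrt(2 * A * dist))
        cmd = TwistStamped()
        cmd.header.stamp = rospy.Time.now()
        if dist > 1e-6:
            cmd.twist.linear.x = speed * p.x / dist
            cmd.twist.linear.y = speed * p.y / dist
            cmd.twist.linear.z = speed * p.z / dist
        self.cmd_pub.publish(cmd)

if __name__ == "__main__":
    AutonomousLanding()
    rospy.spin()
```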
The invention is not limited to the foregoing embodiments. The invention extends to any novel feature or any novel combination of features disclosed in this specification and any novel method or process steps or any novel combination of features disclosed.

Claims (15)

1. An unmanned aerial vehicle landing platform, wherein an identification image is provided at the center of the landing platform, characterized in that the identification image comprises patterns of a multi-size, nested design.
2. An unmanned aerial vehicle landing platform as in claim 1, wherein the identification image comprises a plurality of patterns, wherein the plurality of patterns comprise a primary pattern, wherein a first secondary pattern is nested within the primary pattern, and wherein a plurality of second secondary patterns are circumferentially disposed about the primary pattern.
3. A landing platform identification method, comprising the step of identifying a landing platform from a collected image, characterized in that the step of identifying the landing platform comprises recognizing in the image an identification image of a multi-size, nested pattern design.
4. A landing platform recognition method as claimed in claim 3, wherein the identification image comprises a plurality of patterns, and each pattern in the identification image is provided with a recognition priority, and the associated information of the pattern with the highest priority is recognized as the recognition result of the landing platform.
5. A landing platform identification method as in claim 4, wherein the pattern in the identification image is designed with at least three levels of identification priority; the highest priority identifies a primary pattern centrally located within the identification image, the second priority identifies a first secondary pattern nested within the primary pattern, and the third priority identifies a second secondary pattern circumferentially distributed around the primary pattern.
6. A landing platform identification method according to claim 4 or claim 5, wherein each pattern is provided with a unique identifier, and the setting of the identification priority for each pattern in the identification image is achieved by providing an identification priority for each pattern's identifier.
7. An unmanned aerial vehicle landing method, comprising:
collecting an image;
identifying a landing platform from the collected image using the landing platform identification method of any one of claims 3 to 6;
calculating relative pose data between the unmanned aerial vehicle and the detected landing platform;
calculating landing navigation information according to the relative pose data; and
controlling the unmanned aerial vehicle to land under the guidance of the landing navigation information.
8. An unmanned aerial vehicle landing method as in claim 7, wherein the landing navigation information includes information guiding the unmanned aerial vehicle through an initial acceleration phase, a uniform velocity phase and a terminal deceleration phase of the landing process, wherein:
an initial acceleration stage: accelerating the unmanned aerial vehicle from its current speed to a specified speed;

a uniform-speed stage: maintaining uniform-speed flight at the specified speed;

a terminal deceleration stage: decelerating from the specified speed to zero so that the unmanned aerial vehicle touches the landing platform exactly as the speed reaches zero.
9. A method of landing a drone according to claim 7, wherein the image is acquired when the drone is located within a predetermined height h above the landing point, the predetermined height h being determined by:
h = f · X / X′

in the formula, f is the focal length of the camera, X′ is the maximum value of the imaging plane along the X axis, and X is the maximum value of the landing platform along the X axis.
10. An unmanned aerial vehicle landing method as in claim 7, further comprising the following abnormal-situation handling steps:

if the landing platform is not recognized in the acquired images, pulling the unmanned aerial vehicle up in place to search for the landing platform again; if the landing platform has not been found once the preset height is reached, controlling the unmanned aerial vehicle to fly a circular search of radius r centered on the landing point; and if the landing platform is still not found, controlling the unmanned aerial vehicle to land in an autonomous landing mode;

if the height of the unmanned aerial vehicle falls below a set threshold during the search for the landing platform, controlling the unmanned aerial vehicle to land in an autonomous landing mode.
11. A flight operations system, comprising at least one drone and a drone landing platform according to claim 1 or 2.
12. A flight operations system as claimed in claim 11, wherein each drone includes a camera, an onboard computer and a flight control computer connected in series, wherein:
the camera is configured to: collect images;

recognition priorities for the patterns in the identification image are configured in the onboard computer; the onboard computer is configured to: perform pattern recognition on the collected images, take the associated information of the recognized pattern of highest priority as the detection result of the landing platform, calculate relative pose information between the unmanned aerial vehicle and the landing platform, and calculate landing navigation information from the relative pose information;

the flight control computer is configured to: control the unmanned aerial vehicle to land according to the landing navigation information.
13. The flight operations system of claim 12, wherein the landing navigation information includes information guiding the unmanned aerial vehicle through an initial acceleration phase, a uniform velocity phase and a terminal deceleration phase of the landing process, wherein:
an initial acceleration stage: accelerating the unmanned aerial vehicle from its current speed to a specified speed;

a uniform-speed stage: maintaining uniform-speed flight at the specified speed;

a terminal deceleration stage: decelerating from the specified speed to zero so that the unmanned aerial vehicle touches the landing platform exactly as the speed reaches zero.
14. A flight operations system according to claim 12, wherein the on-board computer triggers the camera to capture an image when the drone is located within a predetermined height h above the landing point, the predetermined height h being determined by:
h = f · X / X′

in the formula, f is the focal length of the camera, X′ is the maximum value of the imaging plane along the X axis, and X is the maximum value of the landing platform along the X axis.
15. A flight operations system according to claim 12, wherein an abnormal-situation handling response is configured in the on-board computer; the on-board computer is configured to: if the landing platform is not recognized in the acquired images, send an instruction to the flight control computer to pull the unmanned aerial vehicle up in place and command the camera to re-acquire images; if the landing platform still cannot be recognized in the re-acquired images once the preset height is reached, send an instruction to the flight control computer to fly the unmanned aerial vehicle in a circle of radius r centered on the landing point while the camera re-acquires images; and if the landing platform still cannot be recognized, send an instruction to the flight control computer to land the unmanned aerial vehicle in an autonomous landing mode;

the on-board computer is further configured to: during execution of the abnormal-situation handling response, if the height of the unmanned aerial vehicle falls below a set threshold, send an instruction to the flight control computer to land the unmanned aerial vehicle in an autonomous landing mode.
CN202111190817.6A 2021-10-13 2021-10-13 Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system Pending CN113759943A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111190817.6A CN113759943A (en) 2021-10-13 2021-10-13 Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system


Publications (1)

Publication Number Publication Date
CN113759943A true CN113759943A (en) 2021-12-07

Family

ID=78799374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111190817.6A Pending CN113759943A (en) 2021-10-13 2021-10-13 Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system

Country Status (1)

Country Link
CN (1) CN113759943A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114489129A (en) * 2022-01-24 2022-05-13 北京远度互联科技有限公司 Unmanned aerial vehicle landing method and related device
CN115857519A (en) * 2023-02-14 2023-03-28 复亚智能科技(太仓)有限公司 Unmanned aerial vehicle curved surface platform autonomous landing method based on visual positioning
CN116578035A (en) * 2023-07-14 2023-08-11 南京理工大学 Rotor unmanned aerial vehicle autonomous landing control system based on digital twin technology
TWI822007B (en) * 2022-04-22 2023-11-11 藏識科技有限公司 Landing platform for aircraft and system thereof

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104932522A (en) * 2015-05-27 2015-09-23 深圳市大疆创新科技有限公司 Autonomous landing method and system for aircraft
CN106502257A (en) * 2016-10-25 2017-03-15 南京奇蛙智能科技有限公司 A kind of unmanned plane precisely lands jamproof control method
CN106527487A (en) * 2016-12-23 2017-03-22 北京理工大学 Autonomous precision landing system of unmanned aerial vehicle on motion platform and landing method
CN107450590A (en) * 2017-08-07 2017-12-08 深圳市科卫泰实业发展有限公司 A kind of unmanned plane auxiliary landing method
US20180237161A1 (en) * 2017-02-21 2018-08-23 Echostar Technologies L.L.C. Systems and methods for uav docking and recharging
CN108549397A (en) * 2018-04-19 2018-09-18 武汉大学 The unmanned plane Autonomous landing method and system assisted based on Quick Response Code and inertial navigation
CN108700890A (en) * 2017-06-12 2018-10-23 深圳市大疆创新科技有限公司 Unmanned plane makes a return voyage control method, unmanned plane and machine readable storage medium
CN108873943A (en) * 2018-07-20 2018-11-23 南京奇蛙智能科技有限公司 A kind of image processing method that unmanned plane Centimeter Level is precisely landed
CN109658461A (en) * 2018-12-24 2019-04-19 中国电子科技集团公司第二十研究所 A kind of unmanned plane localization method of the cooperation two dimensional code based on virtual simulation environment
CN109823552A (en) * 2019-02-14 2019-05-31 深圳市多翼创新科技有限公司 The unmanned plane precision approach method of view-based access control model, storage medium, apparatus and system
CN109839822A (en) * 2019-02-27 2019-06-04 中国人民解放军火箭军工程大学 A kind of quadrotor drone height control method improving active disturbance rejection
CN110488848A (en) * 2019-08-23 2019-11-22 中国航空无线电电子研究所 Unmanned plane vision guide it is autonomous drop method and system
CN110989661A (en) * 2019-11-19 2020-04-10 山东大学 Unmanned aerial vehicle accurate landing method and system based on multiple positioning two-dimensional codes
CN110991207A (en) * 2019-11-19 2020-04-10 山东大学 Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition
CN111176331A (en) * 2020-03-12 2020-05-19 江苏蓝鲸智慧空间研究院有限公司 Precise landing control method for unmanned aerial vehicle
CN111221343A (en) * 2019-11-22 2020-06-02 西安君晖航空科技有限公司 Unmanned aerial vehicle landing method based on embedded two-dimensional code
CN111332485A (en) * 2018-12-19 2020-06-26 福特全球技术公司 Accurate unmanned aerial vehicle landing system and method based on thermal image
CN111506091A (en) * 2020-05-07 2020-08-07 山东力阳智能科技有限公司 Unmanned aerial vehicle accurate landing control system and method based on dynamic two-dimensional code
CN111930146A (en) * 2020-08-25 2020-11-13 上海比茵沃汽车电子有限公司 Vehicle-mounted unmanned aerial vehicle accurate landing recognition method
CN112650298A (en) * 2020-12-30 2021-04-13 广东工业大学 Unmanned aerial vehicle tracking landing method and system
CN113129341A (en) * 2021-04-20 2021-07-16 广东工业大学 Landing tracking control method and system based on light-weight twin network and unmanned aerial vehicle
CN113361674A (en) * 2021-06-04 2021-09-07 重庆邮电大学 Encoding and decoding method of nested guide two-dimensional code



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20211207)