CN103809598A - Rotor unmanned aircraft independent take-off and landing system based on three-layer triangle multi-color landing ground - Google Patents


Info

Publication number: CN103809598A
Authority: CN (China)
Application number: CN201410089860.7A
Other languages: Chinese (zh)
Other versions: CN103809598B (en)
Inventors: 雷旭升, 陆培, 邹莹, 郭克信, 张霄
Original Assignee: 北京航空航天大学 (Beihang University)
Priority date (an assumption, not a legal conclusion): 2014-03-12
Filing date: 2014-03-12
Application filed by 北京航空航天大学 (Beihang University); priority to CN201410089860.7A
Publication of CN103809598A: 2014-05-21
Application granted; publication of CN103809598B: 2016-08-10

Abstract

Disclosed is an autonomous take-off and landing system for a rotor unmanned aircraft based on a three-layer, multi-color triangular landing pad. The system comprises a small rotor unmanned aircraft (SRUA), airborne sensors, a data processing unit, a flight control system, an airborne camera, a landing pad, a wireless image transmission module, a wireless data transmission module, and a ground monitoring station. The airborne sensors include an inertial measurement unit, a global positioning system (GPS) receiver, a barometer, and an ultrasonic rangefinder; the data processing unit fuses the sensor data; the flight control system performs route planning to achieve high-accuracy control of the SRUA; the airborne camera captures images of the landing pad, which is a specially designed landing marker for the aircraft; the wireless image transmission module sends the images to the ground station; the wireless data transmission module carries data and instructions between the aircraft and the ground station; and the ground station consists of a vision processing unit and a display terminal. The system guarantees the reliability of the SRUA's navigation information, improves its control accuracy, is low in cost and convenient to use, and has significant engineering value.

Description

An autonomous take-off and landing system for a rotor unmanned aerial vehicle based on a three-layer isosceles-triangle multi-color landing pad

Technical field

The present invention relates to a vision-based autonomous take-off and landing system for unmanned aerial vehicles (UAVs). It can be used for the autonomous navigation and control of small UAVs and is particularly suited to military and civilian applications in which the landing site is fixed and high navigation accuracy is required.

Background technology

Small rotor unmanned aircraft (SRUAs) are small, inexpensive, and highly maneuverable; they can take off and land vertically, hover, and fly at very low altitude, and they have broad application prospects in military, civilian, and scientific research fields.

Research on SRUA take-off and navigation technology is well advanced and has produced good results, but high-precision navigation and control during the landing phase remains a focus of current research, and domestic work is still at an early stage. Accurate altitude information is the foundation of a safe and stable autonomous landing and has a major impact on system performance. Because an SRUA's airborne inertial elements are unstable in the height direction, and because of limits on size, weight, and cost, accurate altitude is usually obtained from small, low-power devices such as a GPS receiver, a barometric altimeter, and an ultrasonic rangefinder. The barometric altimeter resolves height from the relationship between atmospheric pressure and altitude; it is simple and robust, but during landing the rotor downwash disturbs the airflow, so its measurement accuracy is hard to guarantee. GPS offers high positioning accuracy and errors that do not accumulate over time, but its update rate is low and it is easily disturbed externally; in urban environments with many buildings, occlusion further degrades its accuracy. The ultrasonic rangefinder provides accurate measurements only within an altitude range of 4.2 meters.

Monocular vision systems, which are now comparatively mature, need only a camera and a vision processing unit (VPU): they detect a landing marker and guide the aircraft to the landing point. Existing landing markers are mostly regular shapes (polygon combinations, H-, L-, or T-shapes, or circles) and are almost always designed in black and white with strong contrast, so that the marker can be extracted as a landmark feature when the image is binarized. However, lighting, environment, and camera quality all change the gray values of the marker's pixels to varying degrees: the gray values of the black regions rise while those of the white regions fall, shrinking the contrast between them. Choosing a suitable threshold to segment the marker image and extract its features therefore becomes critically important, and the algorithm becomes more complex. In addition, the marker's physical size limits the vision system's working altitude range: if the UAV is too far from the pad, the marker blurs and measurement error grows; if it is too close, part of the pattern leaves the camera's field of view and information is lost.
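The contrast-shrinking effect described above can be sketched numerically. The gray levels below are hypothetical, purely to illustrate why a fixed binarization threshold tuned for an ideal black-and-white marker fails once lighting shifts the gray values:

```python
# Hypothetical gray levels (not from the patent): an ideal marker has black
# pixels near 30 and white pixels near 220, and a fixed threshold of 128
# separates them perfectly.
black_pixels = [30] * 100
white_pixels = [220] * 100

# Lighting drift narrows the contrast: black regions brighten, white darken.
black_lit = [v + 110 for v in black_pixels]  # gray values now 140
white_lit = [v - 100 for v in white_pixels]  # gray values now 120

THRESHOLD = 128  # tuned for the ideal image
misclassified = sum(v > THRESHOLD for v in black_lit) \
    + sum(v <= THRESHOLD for v in white_lit)
print(misclassified)  # 200: every pixel now lands on the wrong side
```

Color-based segmentation, as adopted by the invention, sidesteps this failure mode by classifying pixels on hue rather than on a single gray threshold.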

Summary of the invention

The technical problem solved by the present invention is to overcome the shortcomings of existing visual navigation technology. By means of a specially designed three-layer isosceles-triangle multi-color landing pad, it provides a monocular-vision-aided altitude measurement and autonomous landing system for UAVs that is accurate, wide in measurement range, robust, economical, reliable, and easy to implement.

The technical solution of the present invention is an autonomous take-off and landing system for a rotor UAV based on a three-layer isosceles-triangle multi-color landing pad, comprising a small rotor UAV (1), airborne sensors (2), an onboard data processing unit (3), a flight control system (4), an airborne camera (5), a ground landing pad (6), a wireless image transmission module (7), a wireless data transmission module (8), and a ground monitoring station (9). The airborne sensors (2), data processing unit (3), and flight control system (4) are mounted on the UAV (1), and the ground monitoring station (9) consists of a vision processing unit (VPU) (10) and a display terminal (11), wherein:

The small rotor UAV (1) carries the airborne sensors (2), data processing unit (3), and flight control system (4), and is the main subject of this autonomous landing system;

The airborne sensors (2) consist of small, lightweight sensors that meet the payload constraints of a micro UAV, such as an inertial measurement unit (IMU), a GPS receiver, a magnetic compass, a barometric altimeter, and an ultrasonic rangefinder. The IMU is autonomous and accurate over short periods, continuously providing position, velocity, and attitude information, but its errors accumulate over time; GPS is global and all-weather but updates slowly and is blocked by obstacles; the magnetic compass provides an accurate heading angle but is affected by external magnetic fields; the barometric altimeter provides altitude when undisturbed by downwash; the ultrasonic rangefinder provides accurate altitude near the ground but gives no information beyond its measurement range;

The hardware core of the onboard data processing unit (3) is a DSP. It first pre-processes each sensor's measurements (outlier rejection, unification of coordinates), then fuses the multiple measurements, for example by Kalman filtering, to produce accurate and reliable navigation information. For altitude, the system uses an altitude-information fusion module based on adaptively weighted averaging, designed around the characteristics of the individual height sensors: it improves the accuracy of the altitude measurement and, when any one height sensor becomes unreliable, adaptively adjusts the weights to preserve the reliability of the altitude estimate throughout the SRUA landing;

The flight control system (4) plans the flight path from the UAV's navigation information and the mission requirements, and performs accurate, stable landing control using an adaptive-neural-network method;

The airborne camera (5) is fixed to the small rotor UAV (1) with its lens pointing vertically downward; during landing it captures images of the ground landing pad (6) in real time and transmits them to the ground monitoring station (9) through the wireless image transmission module (7);

The ground landing pad (6) serves as the UAV's visual marker and consists of a specially designed multi-layer, multi-color isosceles-triangle pattern whose distinctive colors and shape greatly improve the probability of marker recognition;

The wireless image transmission module (7) consists of a transmitter and a receiver, both powered by 12 V DC, with a 1.2 GHz carrier that diffracts well and propagates reliably. The transmitter is fixed to the rotor UAV (1), and the receiver connects to the ground monitoring station through a video capture card;

The wireless data transmission module (8) likewise consists of a transmitter and a receiver, both powered by 5 V DC, with a 900 MHz carrier and a baud rate of 115200 bps;

The ground monitoring station (9) performs monitoring and control tasks and consists mainly of the VPU and the display terminal. To lighten the airborne load, images captured by the airborne camera (5) are sent via the wireless image transmission module (7) to the ground monitoring station (9), where the VPU computes the visual navigation information and returns it to the airborne system via the wireless data transmission module (8). Meanwhile, the display terminal shows the SRUA's position, attitude, velocity, and other information in real time, so that ground staff can send control commands to the UAV as the mission requires.

The principle of the present invention is as follows. The landing pad is placed at the UAV's intended landing point, its dimensions and feature points are calibrated, and the calibration is loaded into the VPU to initialize the vision system. During landing, images of the pad are captured in real time and thresholded by color to segment the isosceles-triangle landmark; a second check against the pre-calibrated Hu invariant moments of the triangle then confirms the detection. After this two-stage color-and-shape filtering confirms the landmark in the image, its feature information is extracted and, combined with the corresponding features in the world coordinate frame, yields the UAV's current height above the pad. The vision-derived height is fused with the simultaneous measurements of the other height sensors by adaptively weighted-average filtering to obtain accurate navigation information, and the reliable altitude is fed into the control system to guide a safe and stable landing.
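The two-stage color-plus-shape check can be sketched as follows. This is an illustrative sketch, not the patent's implementation: after a color threshold isolates a candidate blob, its Hu invariant moments are compared against the pre-calibrated triangle signature. Only the first two Hu invariants are computed here, in pure NumPy, and the function names are my own:

```python
import numpy as np

def hu_first_two(mask: np.ndarray) -> tuple:
    """First two Hu invariant moments of a binary mask.

    With eta_pq the normalized central moments,
    hu1 = eta20 + eta02 and hu2 = (eta20 - eta02)^2 + 4*eta11^2;
    both are invariant to translation and (approximately) to scale.
    """
    ys, xs = np.nonzero(mask)
    m00 = float(len(xs))                      # blob area in pixels
    dx, dy = xs - xs.mean(), ys - ys.mean()   # centered coordinates

    def eta(p, q):
        return np.sum(dx ** p * dy ** q) / m00 ** (1 + (p + q) / 2)

    e20, e02, e11 = eta(2, 0), eta(0, 2), eta(1, 1)
    return (e20 + e02, (e20 - e02) ** 2 + 4 * e11 ** 2)

def marker_confirmed(mask: np.ndarray, hu_ref, tol: float = 0.02) -> bool:
    """Second-stage check: accept a color-segmented blob only if its Hu
    moments fall within `tol` of the calibrated triangle signature."""
    hu = np.asarray(hu_first_two(mask))
    return bool(np.all(np.abs(hu - np.asarray(hu_ref)) <= tol))

# A calibrated right-triangle marker, and the same triangle translated:
pad = np.zeros((60, 60), dtype=bool)
pad[5:25, 5:25] = np.tri(20, dtype=bool)
hu_ref = hu_first_two(pad)

seen = np.zeros((60, 60), dtype=bool)
seen[30:50, 20:40] = np.tri(20, dtype=bool)  # translated, same shape
print(marker_confirmed(seen, hu_ref))        # True: Hu moments are unchanged
```

Because the Hu invariants are computed from central moments, the check passes wherever the triangle appears in the frame, which is what makes the secondary shape confirmation robust to the UAV drifting over the pad.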

Compared with the prior art, the advantages of the present invention are:

(1) measurement information that the present invention selects the filtering method of self-adaptive weighted average to carry out each height sensor merges, and according to the reliability of each height sensor, adjusts at any time the weights of each sensor, guarantees best syncretizing effect.Self-adaptive weighted average has not only guaranteed the measurement accuracy of entire system, simultaneously unreliable in a certain sensor measurement information, as GPS loses star, landing ramp is blocked and causes in the situation of visual information inefficacy, by its weights are independently upgraded, still can assurance system merge reliability highly.

(2) The present invention uses bright, distinct colors to reduce environmental interference and simplify the image-processing algorithm. To avoid interference from similarly colored objects in the natural environment, the visual marker is given the distinctive shape of an isosceles triangle. The nested combination of red, blue, and green isosceles triangles of decreasing size ensures that a suitably sized triangle landmark appears in the captured image at every altitude band, guaranteeing accurate visual navigation throughout the descent. The landing pad itself is easy to manufacture, simple and economical, small, light, and easy to carry.

Brief description of the drawings

Fig. 1 is a structural block diagram of the present invention;

Fig. 2 is a flow chart of the vision algorithm of the present invention;

Fig. 3 is a diagram of the adaptively weighted-average filtering of the present invention;

Fig. 4 is a communication schematic diagram of the present invention.

Embodiment

As shown in Fig. 1, the present invention mainly comprises a small rotor UAV (1), airborne sensors (2), an onboard data processing unit (3), a flight control system (4), an airborne camera (5), a ground landing pad (6), a wireless image transmission module (7), a wireless data transmission module (8), and a ground monitoring station (9). The airborne sensors (2), data processing unit (3), and flight control system (4) are mounted on the UAV (1), and the ground monitoring station (9) consists of a vision processing unit (VPU) (10) and a display terminal (11).

The small rotor UAV (1) carries the airborne sensors (2), data processing unit (3), and flight control system (4), and is the main subject of this autonomous landing system;

The airborne sensors (2) consist of small, lightweight sensors that meet the payload constraints of a micro UAV, such as an inertial measurement unit (IMU), a GPS receiver, a magnetic compass, a barometric altimeter, and an ultrasonic rangefinder. The IMU is autonomous and accurate over short periods, continuously providing position, velocity, and attitude information, but its errors accumulate over time; GPS is global and all-weather but updates slowly and is blocked by obstacles; the magnetic compass provides an accurate heading angle but is affected by external magnetic fields; the barometric altimeter provides altitude when undisturbed by downwash; the ultrasonic rangefinder provides accurate altitude near the ground but gives no information beyond its measurement range. The system therefore fuses the usable measurements of the different height sensors in stages: above 4.2 meters it fuses the vision, barometric-altimeter, and GPS measurements, and below 4.2 meters it adds the ultrasonic measurement to the fusion;

The hardware core of the onboard data processing unit (3) is a DSP. It first pre-processes each sensor's measurements (outlier rejection, unification of coordinates), then fuses the multiple measurements, for example by Kalman filtering, to produce accurate and reliable navigation information. For altitude, the system uses an altitude-information fusion module based on adaptively weighted averaging, designed around the characteristics of the individual height sensors: it improves the accuracy of the altitude measurement and, when any one height sensor becomes unreliable, adaptively adjusts the weights to preserve the reliability of the altitude estimate throughout the SRUA landing;

The flight control system (4) plans the flight path from the UAV's navigation information and the mission requirements, and performs accurate, stable landing control using an adaptive-neural-network method;

The airborne camera (5) is fixed to the small rotor UAV (1) with its lens pointing vertically downward; during landing it captures images of the ground landing pad (6) in real time and transmits them to the ground monitoring station (9) through the wireless image transmission module (7);

The ground landing pad (6) serves as the UAV's visual marker and consists of a specially designed multi-layer, multi-color isosceles-triangle pattern whose distinctive colors and shape greatly improve the probability of marker recognition;

The wireless image transmission module (7) consists of a transmitter and a receiver, both powered by 12 V DC, with a 1.2 GHz carrier that diffracts well and propagates reliably. The transmitter is fixed to the rotor UAV (1), and the receiver connects to the ground monitoring station through a video capture card;

The wireless data transmission module (8) likewise consists of a transmitter and a receiver, both powered by 5 V DC, with a 900 MHz carrier and a baud rate of 115200 bps;

The ground monitoring station (9) performs monitoring and control tasks and consists mainly of the VPU and the display terminal. To lighten the airborne load, images captured by the airborne camera (5) are sent via the wireless image transmission module (7) to the ground monitoring station (9), where the VPU computes the visual navigation information and returns it to the airborne system via the wireless data transmission module (8). Meanwhile, the display terminal shows the SRUA's position, attitude, velocity, and other information in real time, so that ground staff can send control commands to the UAV as the mission requires.

As shown in Fig. 2, the vision algorithm proceeds as follows. Before the UAV begins its landing, the ground monitoring station initializes the vision processing system with the dimensions of the three-layer multi-color landing pad and the camera's intrinsic parameters. In the figure, H_p denotes the UAV's height at the previous moment: at system start it is a rough estimate of the UAV's hover height, and during landing it is the height fused from the multiple sensors. After the vision system captures a live image of the landmark, the value of H_p determines which color of marker pattern to extract, and hence which color-specific binarization method to apply. When the UAV begins its landing from an altitude above 20 meters, the vision system uses the largest, red isosceles triangle as the visual marker; when H_p falls below H_RB it switches to the blue marker; and near the ground, when H_p falls below H_BG, it switches to the smallest, green marker. H_RB and H_BG are determined by the sizes of the three isosceles-triangle patterns on the pad and by the focal length of the camera; in the present invention their values are 700 cm and 180 cm respectively. The Hu invariant-moment algorithm then computes the Hu moments Hu_img of the binarized landmark image and checks whether they lie within the calibrated tolerance. Once the landmark is confirmed in the image, its features are extracted and the color-specific height calculation is applied to obtain the visual height; the attitude-angle-corrected visual information is then fused with the airborne height sensors by adaptive filtering to give the current fused height H.
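The marker-color schedule above can be written down directly. The switch heights H_RB = 700 cm and H_BG = 180 cm are the values given in the patent; the function name and structure are my own illustration:

```python
# Switch heights from the patent, in centimeters.
H_RB = 700.0  # red-to-blue switch height
H_BG = 180.0  # blue-to-green switch height

def marker_color(h_p: float) -> str:
    """Pick which nested triangle to extract from the image, given the fused
    height estimate h_p (cm) carried over from the previous time step."""
    if h_p < H_BG:
        return "green"  # innermost, smallest triangle, used near the ground
    if h_p < H_RB:
        return "blue"   # middle triangle
    return "red"        # outermost, largest triangle, used at high altitude

print(marker_color(2000.0))  # red, e.g. when starting the descent from 20 m
```

The schedule guarantees that the selected triangle always occupies a workable fraction of the camera frame, which is the point of the nested three-layer design.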

As shown in Fig. 3, the adaptively weighted-average filtering module for the height measurements works as follows. First, each sensor's altitude measurement is pre-processed: coordinates and units are unified, and smoothing filters reject accidental errors. Then, with 4.2 meters as the boundary between the high- and low-altitude stages, the high-altitude stage fuses the vision, barometric-altimeter, and GPS information by adaptively weighted-average filtering, while the low-altitude stage adds the ultrasonic altimeter to these three and fuses all four height sensors. In Fig. 3, j denotes the time step; H_1(j), H_2(j), H_3(j), and H_4(j) are the measurements at step j from vision, barometer, ultrasonic, and GPS respectively, and w_1(j), w_2(j), w_3(j), and w_4(j) are the corresponding weights. The fused height at step j is:

H(j) = Σ_{i=1}^{4} w_i(j) · H_i(j)

In the low-altitude stage, the weights are initialized according to the measurement characteristics of the four sensors as follows:

(w_1(j), w_2(j), w_3(j), w_4(j)) = (25%, 10%, 60%, 5%)

During landing, if the Hu invariant-moment check fails to detect the landing pad in the visual solution, the visual information is invalid and w_1(j) is set to 0; otherwise w_1(j) remains 25%. If the GPS receiver tracks fewer than 4 satellites, the GPS information carries large errors and w_4(j) is set to 0; otherwise w_4(j) remains 5%. At every time step the four weights are then renormalized by the following rule:

w_i(j) ← w_i(j) / (w_1(j) + w_2(j) + w_3(j) + w_4(j)),  i = 1, 2, 3, 4

In the high-altitude stage the ultrasonic measurement is unavailable, and the weights are initialized according to the measurement characteristics of the other three sensors as follows:

(w_1(j), w_2(j), w_3(j), w_4(j)) = (60%, 25%, 0%, 15%)

During landing, if the Hu invariant-moment check fails to detect the landing pad in the visual solution, the visual information is invalid and w_1(j) is set to 0; otherwise w_1(j) remains 60%. If the GPS receiver tracks fewer than 4 satellites, the GPS information carries large errors and w_4(j) is set to 0; otherwise w_4(j) remains 15%. At every time step the four weights are then renormalized by the following rule:

w_i(j) ← w_i(j) / (w_1(j) + w_2(j) + w_3(j) + w_4(j)),  i = 1, 2, 3, 4
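The fusion rules above, with their stage-dependent initial weights, the gating of invalid vision or GPS data, and the renormalization step, can be sketched in a few lines. The sensor order [vision, barometer, ultrasonic, GPS] and the percentage values follow the patent; the function itself is my own illustration:

```python
# Stage-dependent initial weights from the patent, in sensor order
# [vision, barometer, ultrasonic, GPS].
LOW_INIT = [0.25, 0.10, 0.60, 0.05]    # low-altitude stage (below 4.2 m)
HIGH_INIT = [0.60, 0.25, 0.00, 0.15]   # high-altitude stage (no ultrasonic)

def fuse_height(measurements, low_altitude, vision_ok, gps_ok):
    """Return H(j) = sum_i w_i(j) * H_i(j) with gated, renormalized weights."""
    w = list(LOW_INIT if low_altitude else HIGH_INIT)
    if not vision_ok:   # Hu-moment check failed: visual height is invalid
        w[0] = 0.0
    if not gps_ok:      # fewer than 4 satellites tracked: GPS unreliable
        w[3] = 0.0
    total = sum(w)
    w = [wi / total for wi in w]  # renormalize so the weights sum to 1
    return sum(wi * hi for wi, hi in zip(w, measurements))

# Low-altitude fusion with all four sensors healthy:
print(fuse_height([3.0, 3.2, 2.9, 3.5], True, True, True))  # ≈ 2.985
```

When a sensor is gated out, its weight is redistributed proportionally among the remaining sensors by the renormalization, which is exactly the mechanism the patent relies on to keep the fused altitude reliable after a vision or GPS dropout.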

As shown in Fig. 4, communication during the UAV's landing proceeds as follows. The landing-pad image is captured in real time and sent by the 1.2 GHz wireless image transmitter to the ground receiver, which passes it through the video capture card to the VPU of the ground computer. The VPU outputs the vision-derived altitude and sends it through the 900 MHz modem to the UAV system, where it is filtered and fused with the simultaneous altitude measurements of the other height sensors; the fused altitude is fed back to the flight control system to guide a safe autonomous landing. Meanwhile, the ground monitoring station can send control commands, such as flight-mode changes, over the wireless data transmission system.

Claims (3)

1. An autonomous take-off and landing system for a rotor unmanned aerial vehicle based on a three-layer isosceles-triangle multi-color landing pad, characterized by comprising: a small rotor UAV (1), airborne sensors (2), an onboard data processing unit (3), a flight control system (4), an airborne camera (5), a ground landing pad (6), a wireless image transmission module (7), a wireless data transmission module (8), and a ground monitoring station (9); the airborne sensors (2), data processing unit (3), and flight control system (4) are mounted on the UAV (1), and the ground monitoring station (9) consists of a vision processing unit (VPU) (10) and a display terminal (11), wherein:
The small rotor UAV (1) carries the airborne sensors (2), onboard data processing unit (3), and flight control system (4), and is the subject of the autonomous landing system;
The airborne sensors (2) consist of small, lightweight sensors that meet the payload constraints of a micro UAV, such as an inertial measurement unit (IMU), a GPS receiver, a magnetic compass, a barometric altimeter, and an ultrasonic rangefinder. The IMU is autonomous and accurate over short periods, continuously providing position, velocity, and attitude information, but its errors accumulate over time; GPS is global and all-weather but updates slowly and is blocked by obstacles; the magnetic compass provides an accurate heading angle but is affected by external magnetic fields; the barometric altimeter provides altitude when undisturbed by downwash; the ultrasonic rangefinder provides accurate altitude near the ground but gives no information beyond its measurement range;
The hardware core of the onboard data processing unit (3) is a DSP. It first pre-processes each sensor's measurements (outlier rejection, unification of coordinates and units), then fuses the multiple measurements, for example by Kalman filtering, to produce accurate and reliable navigation information. For altitude, the system uses an adaptively weighted-average filtering module designed around the characteristics of the individual height sensors to improve the reliability and accuracy of the altitude measurement;
The flight control system (4) plans the flight path from the UAV's navigation information and the mission requirements, and performs accurate, stable landing control using an adaptive-neural-network method;
The airborne camera (5) is fixed to the small rotor UAV (1) with its lens pointing vertically downward; during landing it captures images of the ground landing pad (6) in real time and transmits them to the ground monitoring station (9) through the wireless image transmission module (7);
The ground landing pad (6) serves as the UAV's visual marker and consists of a specially designed multi-layer, multi-color isosceles-triangle pattern whose distinctive colors and shape greatly improve the probability of marker recognition;
The wireless image transmission module (7) consists of a transmitter and a receiver, both powered by 12 V DC, with a 1.2 GHz carrier that diffracts well and propagates reliably; the transmitter is fixed to the UAV (1), and the receiver connects to the ground monitoring station through a video capture card;
The wireless data transmission module (8) likewise consists of a transmitter and a receiver, both powered by 5 V DC, with a 900 MHz carrier and a baud rate of 115200 bps;
The ground monitoring station (9) performs monitoring and control tasks and consists mainly of the VPU and the display terminal. To lighten the airborne load, images captured by the airborne camera (5) are sent via the wireless image transmission module (7) to the ground monitoring station (9), where the VPU computes the visual navigation information and returns it to the airborne system via the wireless data transmission module (8); meanwhile the display terminal shows the SRUA's position, attitude, velocity, and other information in real time, so that ground staff can send control commands to the UAV as the mission requires.
2. The autonomous take-off and landing system for a rotor UAV based on a three-layer isosceles-triangle multi-color landing pad according to claim 1, characterized in that: building on the original Kalman-filter-based multi-source information fusion, the onboard data processing unit (3) includes an altitude-information fusion module based on adaptively weighted averaging, designed for the characteristics of the four height sensors used by the system; it not only improves the accuracy of the altitude measurement but also adaptively adjusts the weights when any one height sensor becomes unreliable, guaranteeing reliable altitude information throughout the SRUA landing.
3. The autonomous take-off and landing system for a rotor UAV based on a three-layer isosceles-triangle multi-color landing pad according to claim 1, characterized in that: the ground landing pad (6) is designed as a visual marker with practicality, ease of distinction from the environment, and convenient implementation of the vision algorithm in mind. Since UAVs often operate in places rich in gray-level information, the visual marker uses distinct solid colors so that the surroundings introduce no noise; to distinguish it from color information that may exist in the environment, the marker is further shaped as a regular isosceles triangle so that Hu invariant moments can be used for secondary identification; and to meet the demand for high-precision visual navigation at both long and short range, the final marker is a pattern of nested isosceles triangles: at long range the vision solution extracts the features of the outer, large red triangle, and at short range it extracts those of the inner, small green triangle.
CN201410089860.7A 2014-03-12 2014-03-12 Autonomous take-off and landing system for a rotor unmanned aerial vehicle based on a three-layer isosceles-triangle multi-color landing ground CN103809598B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410089860.7A CN103809598B (en) 2014-03-12 2014-03-12 Autonomous take-off and landing system for a rotor unmanned aerial vehicle based on a three-layer isosceles-triangle multi-color landing ground

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410089860.7A CN103809598B (en) 2014-03-12 2014-03-12 Autonomous take-off and landing system for a rotor unmanned aerial vehicle based on a three-layer isosceles-triangle multi-color landing ground

Publications (2)

Publication Number Publication Date
CN103809598A true CN103809598A (en) 2014-05-21
CN103809598B CN103809598B (en) 2016-08-10

Family

ID=50706530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410089860.7A CN103809598B (en) 2014-03-12 2014-03-12 Autonomous take-off and landing system for a rotor unmanned aerial vehicle based on a three-layer isosceles-triangle multi-color landing ground

Country Status (1)

Country Link
CN (1) CN103809598B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100842101B1 (en) * 2007-06-15 2008-06-30 Korean Air Lines Co., Ltd. Automatic recovery method of UAV using vision information
US20090306840A1 (en) * 2008-04-08 2009-12-10 Blenkhorn Kevin P Vision-based automated landing system for unmanned aerial vehicles
WO2010045271A1 (en) * 2008-10-14 2010-04-22 Joshua Victor Aller Target and method of detecting, identifying, and determining 3-d pose of the target
CN101833104A (en) * 2010-04-27 2010-09-15 北京航空航天大学 Three-dimensional visual navigation method based on multi-sensor information fusion
CN101944295A (en) * 2010-09-08 2011-01-12 北京航空航天大学 Method for arranging traffic pattern of unmanned aerial vehicle
CN103208206A (en) * 2013-03-21 2013-07-17 北京航空航天大学 Method for arranging traffic patterns of unmanned aerial vehicles on terrain constraint condition

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LEI Xusheng et al.: "A height information fusion method for a small unmanned rotorcraft", ROBOT, vol. 34, no. 4, 30 December 2012 (2012-12-30) *

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104049641A (en) * 2014-05-29 2014-09-17 深圳市大疆创新科技有限公司 Automatic landing method and device and air vehicle
CN105302146A (en) * 2014-07-25 2016-02-03 空中客车运营简化股份公司 Method and system for automatic autonomous landing of an aircraft
CN105292472A (en) * 2014-07-28 2016-02-03 中国科学院沈阳自动化研究所 Multi-purpose flexible-wing unmanned aerial vehicle
CN104166854A (en) * 2014-08-03 2014-11-26 浙江大学 Vision grading landmark locating and identifying method for autonomous landing of small unmanned aerial vehicle
CN104166854B (en) * 2014-08-03 2016-06-01 浙江大学 For the visual rating scale terrestrial reference positioning identifying method of miniature self-service machine Autonomous landing
US10414517B2 (en) 2014-11-19 2019-09-17 SZ DJI Technology Co., Ltd. Positioning mechanism, UAV dock using same, and UAV replenishment method
WO2016078025A1 (en) * 2014-11-19 2016-05-26 深圳市大疆创新科技有限公司 Positioning mechanism, uav base station using the positioning mechanism, and uav replenishment method
CN104590552A (en) * 2014-12-08 2015-05-06 天津大学 Miniature multi-rotor aircraft based on visual navigation
CN105676875A (en) * 2015-03-10 2016-06-15 张超 Automatic landing system of unmanned aerial vehicle
CN104679013A (en) * 2015-03-10 2015-06-03 无锡桑尼安科技有限公司 Unmanned plane automatic landing system
CN105068553A (en) * 2015-03-10 2015-11-18 无锡桑尼安科技有限公司 Unmanned aerial vehicle automatic landing system
CN104808684A (en) * 2015-04-22 2015-07-29 深圳市视晶无线技术有限公司 Aircraft precision positioning and anchoring system and positioning and anchoring method thereof
CN104808684B (en) * 2015-04-22 2017-11-03 深圳市视晶无线技术有限公司 Aircraft precision positioning anchoring system and its positioning anchorage method
CN104808685A (en) * 2015-04-27 2015-07-29 中国科学院长春光学精密机械与物理研究所 Vision auxiliary device and method for automatic landing of unmanned aerial vehicle
CN104980708A (en) * 2015-06-30 2015-10-14 江苏首控制造技术有限公司 Hand-held-terminal-contained remote direct-broadcasting monitoring system of unmanned plane
CN104994355A (en) * 2015-07-14 2015-10-21 杨珊珊 Intelligent active image acquisition and update system and method
CN105652887A (en) * 2016-03-22 2016-06-08 临沂高新区翔鸿电子科技有限公司 Unmanned aerial vehicle landing method adopting two-level graph recognition
WO2017166724A1 (en) * 2016-03-30 2017-10-05 乐视控股(北京)有限公司 Aircraft apron device, aircraft, and aircraft parking system
CN105867411A (en) * 2016-04-14 2016-08-17 南方科技大学 Landing method and landing system of unmanned aerial vehicle
WO2018006376A1 (en) * 2016-07-07 2018-01-11 SZ DJI Technology Co., Ltd. Method and system for controlling a movable object using machine-readable code
WO2018035835A1 (en) * 2016-08-26 2018-03-01 SZ DJI Technology Co., Ltd. Methods and system for autonomous landing
CN106371447A (en) * 2016-10-25 2017-02-01 南京奇蛙智能科技有限公司 Controlling method for all-weather precision landing of unmanned aerial vehicle
CN106774386A (en) * 2016-12-06 2017-05-31 杭州灵目科技有限公司 Unmanned plane vision guided navigation landing system based on multiple dimensioned marker
CN106774386B (en) * 2016-12-06 2019-08-13 杭州灵目科技有限公司 Unmanned plane vision guided navigation landing system based on multiple dimensioned marker
CN107063261A (en) * 2017-03-29 2017-08-18 东北大学 The multicharacteristic information terrestrial reference detection method precisely landed for unmanned plane
CN107063261B (en) * 2017-03-29 2020-01-17 东北大学 Multi-feature information landmark detection method for precise landing of unmanned aerial vehicle
CN108536167A (en) * 2018-07-17 2018-09-14 哈尔滨工业大学(威海) Unmanned plane Autonomous landing method for tilting non-stationary platform

Also Published As

Publication number Publication date
CN103809598B (en) 2016-08-10

Similar Documents

Publication Publication Date Title
US9317036B2 (en) Flight control for flight-restricted regions
Watts et al. Small unmanned aircraft systems for low‐altitude aerial surveys
EP2118713B1 (en) Precision approach control
JP2014040231A (en) Autonomous airspace flight planning and virtual airspace containment system
US5235513A (en) Aircraft automatic landing system
US20090138138A1 (en) Imaging and display system to aid helicopter landings in brownout conditions
US10467913B1 (en) Flight assistant
US9310222B1 (en) Flight assistant with automatic configuration and landing site selection method and apparatus
US9513635B1 (en) Unmanned aerial vehicle inspection system
CN104808675B (en) Body-sensing flight control system and terminal device based on intelligent terminal
US20180321041A1 (en) Methods and systems for determining a state of an unmanned aerial vehicle
US9896202B2 (en) Systems and methods for reliable relative navigation and autonomous following between unmanned aerial vehicle and a target object
US10061328B2 (en) Autonomous landing and control
Cesetti et al. A vision-based guidance system for UAV navigation and safe landing using natural landmarks
Máthé et al. Vision and control for UAVs: A survey of general methods and of inexpensive platforms for infrastructure inspection
CN103135550B (en) Multiple obstacle-avoidance control method of unmanned plane used for electric wire inspection
CN104246641A (en) Safe emergency landing of a UAV
Kendoul Survey of advances in guidance, navigation, and control of unmanned rotorcraft systems
US10240930B2 (en) Sensor fusion
CN102707724A (en) Visual localization and obstacle avoidance method and system for unmanned plane
EP2430615A2 (en) Method and system for visual collision detection and estimation
US9958875B2 (en) Autonomous cargo delivery system
US9835453B2 (en) Ground control point assignment and determination system
EP2885676B1 (en) Flying camera with string assembly for localization and interaction
CN102620736A (en) Navigation method for unmanned aerial vehicle

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant