CN103809598A - Rotor unmanned aircraft independent take-off and landing system based on three-layer triangle multi-color landing ground - Google Patents
- Publication number
- CN103809598A CN103809598A CN201410089860.7A CN201410089860A CN103809598A CN 103809598 A CN103809598 A CN 103809598A CN 201410089860 A CN201410089860 A CN 201410089860A CN 103809598 A CN103809598 A CN 103809598A
- Authority
- CN
- China
- Prior art keywords
- landing
- information
- ground
- unmanned plane
- aerial vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Abstract
Disclosed is an autonomous take-off and landing system for a rotor unmanned aircraft based on a three-layer triangular multi-color landing pad. The system comprises a small rotor unmanned aircraft (SRUA), onboard sensors, a data processing unit, a flight control system, an onboard camera, a landing pad, a wireless image transmission module, a wireless data transmission module, and a ground monitoring station. The onboard sensors include an inertial measurement unit, a global positioning system (GPS) receiver, a barometer, and an ultrasonic rangefinder; the data processing unit fuses the sensor data; the flight control system performs route planning to achieve high-accuracy control of the SRUA; the onboard camera captures images of the landing pad, a specially designed landing marker for the aircraft; the wireless image transmission module transmits the images to the ground station; the wireless data transmission module carries data and instructions between the aircraft and the ground station; and the ground station consists of a vision processing unit and a display terminal. The system guarantees the reliability of the SRUA's navigation information, improves its control accuracy, is low in cost and convenient to deploy, and has significant engineering value.
Description
Technical field
The present invention relates to a vision-aided autonomous landing system for unmanned aerial vehicles. It can be used for the autonomous navigation and control of small unmanned aircraft, and is especially suited to military and civilian applications in which the landing site is fixed and high navigation accuracy is required.
Background art
The SRUA is small, low-cost, and highly maneuverable; it can take off and land vertically, hover, and fly at very low altitude, and therefore has broad application prospects in military, civilian, and scientific-research domains.
Research on SRUA take-off and navigation technology is already deep and has produced good results, but high-precision navigation control during the landing phase remains a focus of current research and is still at an early stage domestically. Accurate height information is the basis for a safe and stable autonomous landing and has a material impact on overall system performance. Because the SRUA's onboard inertial elements are unstable in the height channel, and because of limits on its size, weight, and cost, accurate height is usually obtained from small, low-power devices such as a GPS receiver, a barometric altimeter, and an ultrasonic rangefinder. The barometric altimeter derives height from the relationship between atmospheric pressure and altitude; it is structurally simple and autonomous, but during landing the rotor downwash disturbs the local airflow, so its measurement accuracy is hard to guarantee. GPS offers good positional accuracy and errors that do not accumulate over time, but its update rate is low and its signal is easily disturbed; in urban environments with many buildings, occlusion occurs readily and further degrades measurement accuracy. The ultrasonic rangefinder provides accurate measurements only within a height range of 4.2 meters.
Monocular vision systems are comparatively mature: only one camera and one vision processing unit are needed, and the aircraft is guided to the landing point by detecting the landing marker. Existing landing-pad markers are mostly regular shapes such as polygon combinations, H-, L-, or T-shapes, or circles, and their common feature is a high-contrast black-and-white design so that the landmark can be extracted by binarizing the image. However, lighting, environment, and camera quality all shift the gray values of the marker's pixels to varying degrees: the gray value of the black region rises while that of the white region falls, narrowing the difference between them. Choosing a suitable threshold for segmenting the landing-pad image so as to extract the marker's features therefore becomes critical, and the algorithms become complex. In addition, the marker's physical size limits the vision system's working altitude range: if the aircraft is too far from the pad, the marker becomes blurred and measurement error grows; if it is too close, part of the marker pattern leaves the camera's field of view and information is lost.
Summary of the invention
The technical problem addressed by the present invention is to overcome the shortcomings of existing vision navigation techniques. By means of a specially designed three-layer isosceles-triangle multi-color landing pad, the invention provides a monocular-vision-aided height measurement and autonomous landing system for unmanned aircraft that is accurate, wide in measurement range, robust, economical, reliable, and easy to implement.
The technical solution of the present invention is: an autonomous landing system for a rotor unmanned aerial vehicle based on a three-layer isosceles-triangle multi-color landing pad, comprising a small rotor UAV (1), onboard sensors (2), an onboard data processing unit (3), a flight control system (4), an onboard camera (5), a ground landing pad (6), a wireless image transmission module (7), a wireless data transmission module (8), and a ground monitoring station (9). The onboard sensors (2), data processing unit (3), and flight control system (4) are mounted on the UAV (1), and the ground monitoring station (9) consists of a vision processing unit (10) and a display terminal (11), wherein:
The small rotor UAV (1) carries the onboard sensors (2), the data processing unit (3), and the flight control system (4), and is the main subject of this autonomous landing system;
The onboard sensors (2) consist mainly of sensors that are small in volume and light in weight, meeting the payload constraints of a micro UAV: an inertial measurement unit (IMU), a GPS receiver, a magnetic compass, a barometric altimeter, an ultrasonic rangefinder, and so on. The IMU is autonomous and accurate over short time spans, and continuously provides position, velocity, and attitude, but its errors accumulate over time; GPS is global and all-weather, but its update rate is low and its signal can be blocked by obstacles; the magnetic compass provides an accurate heading angle but is affected by external magnetic fields; the barometric altimeter provides height when undisturbed by downwash; the ultrasonic rangefinder provides accurate height in the near-ground region, but beyond its measurement range its information is unavailable;
The hardware core of the onboard data processing unit (3) is a DSP. It first preprocesses each sensor's measurements, including outlier rejection and unification of coordinates, and then fuses the measurements, for example with Kalman filtering, to provide high-precision, high-reliability navigation information. For height, the system exploits the characteristics of the various height sensors and implements a height-fusion module based on adaptive weighted averaging: it improves the precision of the height measurement and, when one height sensor's information becomes unreliable, adaptively adjusts the weights so that the height information remains reliable throughout the SRUA's landing;
The flight control system (4) plans the path from the UAV's navigation information and mission requirements, and performs high-precision, stable landing control using an adaptive-neural-network-based method;
The onboard camera (5) is fixed to the small rotor UAV (1) with its lens pointing vertically downward. During landing it captures images of the ground landing pad (6) in real time and transfers them to the ground monitoring station (9) via the wireless image transmission module (7);
The ground landing pad (6) serves as the UAV's visual beacon. It is composed of a specially designed multi-layer isosceles-triangle multi-color pattern whose striking colors and shape features greatly improve the probability of landmark recognition;
The wireless image transmission module (7) consists of an image transmitter and a receiver, both powered by 12 V DC, with a 1.2 GHz carrier that diffracts well and propagates effectively. The transmitter is fixed to the UAV (1); the receiver is connected to the ground monitoring station through a video capture card;
The wireless data transmission module (8) likewise consists of a data transmitter and a data receiver, both powered by 5 V DC, with a 900 MHz carrier and a baud rate of 115200 bps;
The ground monitoring station (9) performs the monitoring and control tasks and consists mainly of the vision processing unit and the display terminal. To lighten the airborne load, the landing-pad images captured by the onboard camera (5) are sent through the wireless image transmission module (7) to the ground monitoring station (9), where the vision processing unit computes the vision navigation information and returns it to the airborne system through the wireless data transmission module (8). Meanwhile, the ground display terminal shows the SRUA's position, attitude, speed, and other information in real time, so that ground staff can send control instructions to the UAV according to mission requirements.
The principle of the invention is as follows. The landing pad is placed at the UAV's intended landing point; its dimensions and feature points are calibrated and entered into the vision processing unit to initialize the vision system. During landing, landing-pad images are captured in real time and segmented by a color-based threshold to isolate the isosceles-triangle landmark; the pre-calibrated Hu invariant moments of the landmark are then used for a secondary check. After this two-stage filtering on color and shape confirms that the landmark is in the image, the landmark's feature information is extracted and, combined with the corresponding features in the world coordinate system, yields the UAV's current height above the landing pad. The vision-derived height is fused with the simultaneous measurements of the other height sensors by adaptive weighted-average filtering, producing high-precision navigation information. This reliable height information is fed to the control system to guide a safe and stable landing.
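The two-stage landmark check just described (color-threshold segmentation followed by a Hu invariant-moment comparison) can be sketched in Python. This is a minimal illustration under stated assumptions, not the patent's implementation: the color bounds, the tolerance, and the use of only the first Hu invariant for the shape test are all assumptions made for the sketch.

```python
# Two-stage landmark filter sketch: (1) color segmentation, (2) shape check
# against a pre-calibrated Hu invariant.  Pure Python; an image is a list of
# rows of (R, G, B) tuples.  Thresholds here are illustrative assumptions.

def color_mask(img, lo, hi):
    """Binarize an RGB image: 1 where every channel lies within [lo, hi]."""
    h, w = len(img), len(img[0])
    return [[1 if all(lo[c] <= img[y][x][c] <= hi[c] for c in range(3)) else 0
             for x in range(w)] for y in range(h)]

def hu1(mask):
    """First Hu invariant (eta20 + eta02) of a non-empty binary mask."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    m00 = len(pts)
    cx = sum(x for x, _ in pts) / m00
    cy = sum(y for _, y in pts) / m00
    mu20 = sum((x - cx) ** 2 for x, _ in pts)
    mu02 = sum((y - cy) ** 2 for _, y in pts)
    return (mu20 + mu02) / m00 ** 2   # normalized central moments

def landmark_present(img, lo, hi, hu_ref, tol):
    """Both stages must pass: some pixels match the color, and the segmented
    blob's Hu invariant lies within tol of the calibrated reference."""
    mask = color_mask(img, lo, hi)
    if not any(v for row in mask for v in row):
        return False                       # color stage failed: no landmark
    return abs(hu1(mask) - hu_ref) < tol   # shape (Hu-moment) stage
```

In use, `hu_ref` would be calibrated offline from the known triangle pattern, exactly as the vision-system initialization step above loads the pad's dimensions and feature points.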
Compared with the prior art, the present invention has the following advantages:
(1) The invention fuses the measurements of the height sensors with an adaptive weighted-average filter, adjusting each sensor's weight at any time according to its reliability to guarantee the best fusion result. Adaptive weighted averaging not only secures the measurement accuracy of the overall system; when one sensor's measurements become unreliable, for example when GPS loses satellite lock or the landing pad is occluded so that the vision information fails, autonomously updating that sensor's weight still keeps the fused height reliable.
(2) The invention exploits the beacon's striking colors to reduce environmental interference and simplify the image-processing algorithm. To avoid interference from similarly colored objects that may exist in the natural environment, the visual beacon is additionally given a distinctive isosceles-triangle shape. The nested combination of red, blue, and green isosceles triangles in decreasing size ensures that the image captured in every height band contains a suitably sized triangular landmark, guaranteeing the precision of vision navigation over the whole descent. The landing pad is easy to manufacture, simple and economical, small in volume, light in weight, and easy to carry.
Brief description of the drawings
Fig. 1 is the structural block diagram of the present invention;
Fig. 2 is the flow chart of the vision algorithm of the present invention;
Fig. 3 is the adaptive weighted-average filtering diagram of the present invention;
Fig. 4 is the communication schematic diagram of the present invention;
Embodiment
As shown in Fig. 1, the present invention mainly comprises a small rotor UAV (1), onboard sensors (2), an onboard data processing unit (3), a flight control system (4), an onboard camera (5), a ground landing pad (6), a wireless image transmission module (7), a wireless data transmission module (8), and a ground monitoring station (9). The onboard sensors (2), data processing unit (3), and flight control system (4) are mounted on the UAV (1), and the ground monitoring station (9) consists of a vision processing unit (10) and a display terminal (11).
The small rotor UAV (1) carries the onboard sensors (2), the data processing unit (3), and the flight control system (4), and is the main subject of this autonomous landing system;
The onboard sensors (2) consist mainly of sensors that are small in volume and light in weight, meeting the payload constraints of a micro UAV: an inertial measurement unit (IMU), a GPS receiver, a magnetic compass, a barometric altimeter, an ultrasonic rangefinder, and so on. The IMU is autonomous and accurate over short time spans, and continuously provides position, velocity, and attitude, but its errors accumulate over time; GPS is global and all-weather, but its update rate is low and its signal can be blocked by obstacles; the magnetic compass provides an accurate heading angle but is affected by external magnetic fields; the barometric altimeter provides height when undisturbed by downwash; the ultrasonic rangefinder provides accurate height in the near-ground region, but beyond its measurement range its information is unavailable. The system therefore fuses the effective measurements of the height sensors in stages: above 4.2 meters it fuses vision, barometric-altimeter, and GPS information, and below 4.2 meters it fuses vision, ultrasonic, and GPS information;
The hardware core of the onboard data processing unit (3) is a DSP. It first preprocesses each sensor's measurements, including outlier rejection and unification of coordinates, and then fuses the measurements, for example with Kalman filtering, to provide high-precision, high-reliability navigation information. For height, the system exploits the characteristics of the various height sensors and implements a height-fusion module based on adaptive weighted averaging: it improves the precision of the height measurement and, when one height sensor's information becomes unreliable, adaptively adjusts the weights so that the height information remains reliable throughout the SRUA's landing;
The flight control system (4) plans the path from the UAV's navigation information and mission requirements, and performs high-precision, stable landing control using an adaptive-neural-network-based method;
The onboard camera (5) is fixed to the small rotor UAV (1) with its lens pointing vertically downward. During landing it captures images of the ground landing pad (6) in real time and transfers them to the ground monitoring station (9) via the wireless image transmission module (7);
The ground landing pad (6) serves as the UAV's visual beacon. It is composed of a specially designed multi-layer isosceles-triangle multi-color pattern whose striking colors and shape features greatly improve the probability of landmark recognition;
The wireless image transmission module (7) consists of an image transmitter and a receiver, both powered by 12 V DC, with a 1.2 GHz carrier that diffracts well and propagates effectively. The transmitter is fixed to the UAV (1); the receiver is connected to the ground monitoring station through a video capture card;
The wireless data transmission module (8) likewise consists of a data transmitter and a data receiver, both powered by 5 V DC, with a 900 MHz carrier and a baud rate of 115200 bps;
The ground monitoring station (9) performs the monitoring and control tasks and consists mainly of the vision processing unit and the display terminal. To lighten the airborne load, the landing-pad images captured by the onboard camera (5) are sent through the wireless image transmission module (7) to the ground monitoring station (9), where the vision processing unit computes the vision navigation information and returns it to the airborne system through the wireless data transmission module (8). Meanwhile, the ground display terminal shows the SRUA's position, attitude, speed, and other information in real time, so that ground staff can send control instructions to the UAV according to mission requirements.
As shown in Fig. 2, the vision algorithm proceeds as follows. Before the UAV begins its landing, the ground monitoring station initializes the vision processing system from the dimensions of the three-color triangular landing pad and the camera's intrinsic parameters. In the figure, H_p denotes the UAV's height at the previous moment: when the system starts it is a rough estimate of the UAV's hover height, and during landing it is the height after multi-sensor fusion. After the vision system captures a live landmark image, the value of H_p determines which color of landmark pattern to extract, and hence which color-specific image binarization method to use. When the UAV descends from an altitude above 20 meters, the vision system selects the largest, red isosceles triangle as the visual beacon; when H_p falls below H_RB it switches to the blue landmark, and near the ground, when H_p falls below H_BG, it switches to the smallest, green landmark. H_RB and H_BG are determined by the sizes of the three isosceles-triangle patterns on the pad and by the camera's focal length; in the present invention their values are 700 cm and 180 cm respectively. The Hu invariant-moment algorithm then computes the Hu moments Hu_img of the binarized landmark image and checks whether they lie within the calibrated Hu-moment range. Once the landmark is confirmed to be in the image, its features are extracted, the color-specific height computation is selected, and the vision height is obtained. After correction by the attitude angles, the vision information undergoes adaptive filtering fusion with the onboard height sensors, yielding the current fused height H.
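The height-driven switch between the three landmark colors can be sketched as follows. The thresholds H_RB = 700 cm and H_BG = 180 cm are the values stated in the text; the function itself is an illustrative reconstruction of the selection rule, not the patent's code.

```python
# Sketch of the height-driven beacon-color switch: track the red (outer),
# blue (middle), or green (inner) triangle depending on the fused height H_p.

H_RB = 700.0  # cm: below this height, switch from red to blue (from the text)
H_BG = 180.0  # cm: below this height, switch from blue to green (from the text)

def beacon_color(h_p):
    """Choose which nested triangle to track from the fused height H_p in cm."""
    if h_p >= H_RB:
        return "red"    # outer, largest triangle: visible from high altitude
    if h_p >= H_BG:
        return "blue"   # middle triangle
    return "green"      # inner, smallest triangle: near the ground
```

This mirrors the flow in Fig. 2: the selected color then determines which binarization method and which height-calculation branch the vision system uses.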
As shown in Fig. 3, the adaptive weighted-average filter module for the height measurements works as follows. First, the height measured by each sensor is preprocessed: coordinates and units are unified, and smoothing filters reject accidental errors. Then, taking 4.2 meters as the boundary between the high- and low-altitude regimes, the high-altitude stage fuses vision, barometric-altimeter, and GPS information by adaptive weighted-average filtering, while the low-altitude stage adds the ultrasonic height module to those three sensors and performs adaptive weighted-average fusion of all four. In Fig. 3, j denotes the time step; H1(j), H2(j), H3(j), H4(j) are the measurements at step j of vision, barometer, ultrasonic, and GPS respectively; and w1(j), w2(j), w3(j), w4(j) are the corresponding weights. The fused height at step j is:

H(j) = w1(j)H1(j) + w2(j)H2(j) + w3(j)H3(j) + w4(j)H4(j)
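The weighted-average fusion H(j) = w1(j)H1(j) + w2(j)H2(j) + w3(j)H3(j) + w4(j)H4(j) can be sketched in a few lines. One assumption is made here: when some weights have been zeroed, the remaining weights are renormalized so they still sum to 1, which is one common way to keep the average unbiased.

```python
# Minimal sketch of adaptive weighted-average height fusion at one time step.
# `measurements` and `weights` follow the sensor order in the text:
# vision, barometer, ultrasonic, GPS.  Renormalization of the surviving
# weights is an assumption about how invalidated sensors are handled.

def fuse_height(measurements, weights):
    """Weighted average of the valid height measurements at one time step."""
    total = sum(weights)
    if total == 0:
        raise ValueError("no valid height sensor at this time step")
    return sum(w * h for w, h in zip(weights, measurements)) / total
```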
In the low-altitude stage, the weights are initialized according to the measurement characteristics of the four sensors. During landing, if the Hu invariant-moment check fails to detect the landing pad in the image, the vision information is invalid and w1(j) is set to 0; otherwise w1(j) remains at 25%. If the GPS receiver tracks fewer than 4 satellites, the GPS information carries a large error and w4(j) is set to 0; otherwise w4(j) remains at 5%. At every time step all four weights are updated by this rule.
In the high-altitude stage, the ultrasonic measurements are unavailable, and the weights of the remaining three sensors are initialized according to their measurement characteristics. During landing, if the Hu invariant-moment check fails to detect the landing pad in the image, the vision information is invalid and w1(j) is set to 0; otherwise w1(j) remains at 60%. If the GPS receiver tracks fewer than 4 satellites, the GPS information carries a large error and w4(j) is set to 0; otherwise w4(j) remains at 15%. At every time step the weights are updated by the same rule.
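The weight-update rule for both stages can be sketched as follows. The vision and GPS initial weights (25%/5% in the low-altitude stage, 60%/15% in the high-altitude stage) are stated in the text; the remaining initial weights below are illustrative placeholders, since the source's initialization tables are not reproduced on this page, and the renormalization step is likewise an assumption.

```python
# Sketch of the per-step weight update: start from the stage's initial
# weights, zero the vision weight when the Hu-moment check fails and the GPS
# weight when fewer than 4 satellites are tracked, then renormalize.
# Only the vision and GPS initial values come from the text; the barometer
# and ultrasonic splits are assumed placeholders.

LOW_ALT_INIT  = {"vision": 0.25, "baro": 0.35, "ultra": 0.35, "gps": 0.05}
HIGH_ALT_INIT = {"vision": 0.60, "baro": 0.25, "ultra": 0.00, "gps": 0.15}

def update_weights(init, vision_ok, gps_sats):
    """Return the adjusted, renormalized weights for one time step."""
    w = dict(init)
    if not vision_ok:
        w["vision"] = 0.0   # landing pad not detected by the Hu-moment check
    if gps_sats < 4:
        w["gps"] = 0.0      # GPS fix unreliable with fewer than 4 satellites
    total = sum(w.values())
    return {k: v / total for k, v in w.items()}  # renormalize to sum to 1
```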
As shown in Fig. 4, communication during the UAV's landing proceeds as follows. The landing-pad images captured in real time are sent by the 1.2 GHz wireless image transmitter to the ground wireless image receiver, and passed to the ground computer's vision processing unit through the video capture card. The vision processing unit outputs the vision-derived height and sends it to the UAV system through the 900 MHz modem, where it undergoes filtering fusion with the simultaneous height information from the other sensors; the fused height is fed back to the flight control system to guide a safe autonomous landing. Meanwhile, the ground monitoring station sends control commands, such as flight-mode changes, to the UAV over the wireless data transmission system.
Claims (3)
1. An autonomous landing system for a rotor unmanned aerial vehicle based on a three-layer isosceles-triangle multi-color landing pad, characterized by comprising: a small rotor UAV (1), onboard sensors (2), an onboard data processing unit (3), a flight control system (4), an onboard camera (5), a ground landing pad (6), a wireless image transmission module (7), a wireless data transmission module (8), and a ground monitoring station (9); the onboard sensors (2), data processing unit (3), and flight control system (4) are mounted on the UAV (1), and the ground monitoring station (9) consists of a vision processing unit (10) and a display terminal (11), wherein:
The small rotor UAV (1) carries the onboard sensors (2), the onboard data processing unit (3), and the flight control system (4), and is the main subject of this autonomous landing system;
The onboard sensors (2) consist mainly of sensors that are small in volume and light in weight, meeting the payload requirements of a micro UAV system: an inertial measurement unit (IMU), a GPS receiver, a magnetic compass, a barometric altimeter, an ultrasonic rangefinder, and so on. The IMU is autonomous and accurate over short time spans, and continuously provides position, velocity, and attitude, but its errors accumulate over time; GPS is global and all-weather, but its update rate is low and its signal can be blocked by obstacles; the magnetic compass provides an accurate heading angle but is affected by external magnetic fields; the barometric altimeter provides height when undisturbed by downwash; the ultrasonic rangefinder provides accurate height in the near-ground region, but beyond its measurement range its information is unavailable;
The hardware core of the onboard data processing unit (3) is a DSP. It first preprocesses each sensor's measurements, including outlier rejection and unification of coordinates and units, and then fuses the measurements, for example with Kalman filtering, to provide high-precision, high-reliability navigation information. For height, the system exploits the characteristics of the various height sensors through an adaptive weighted-average filter module to improve both the reliability and the accuracy of the height measurement;
The flight control system (4) plans the path from the UAV's navigation information and mission requirements, and performs high-precision, stable landing control using an adaptive-neural-network-based method;
The onboard camera (5) is fixed to the small rotor UAV (1) with its lens pointing vertically downward. During landing it captures images of the ground landing pad (6) in real time and transfers them to the ground monitoring station (9) via the wireless image transmission module (7);
The ground landing pad (6) serves as the UAV's visual beacon. It is composed of a specially designed multi-layer isosceles-triangle multi-color pattern whose striking colors and shape features greatly improve the probability of landmark recognition;
The wireless image transmission module (7) consists of an image transmitter and a receiver, both powered by 12 V DC, with a 1.2 GHz carrier that diffracts well and propagates effectively. The transmitter is fixed to the UAV (1); the receiver is connected to the ground monitoring station through a video capture card;
The wireless data transmission module (8) likewise consists of a data transmitter and a data receiver, both powered by 5 V DC, with a 900 MHz carrier and a baud rate of 115200 bps;
The ground monitoring station (9) performs the monitoring and control tasks and consists mainly of the vision processing unit and the display terminal. To lighten the airborne load, the landing-pad images captured by the onboard camera (5) are sent through the wireless image transmission module (7) to the ground monitoring station (9), where the vision processing unit computes the vision navigation information and returns it to the airborne system through the wireless data transmission module (8). Meanwhile, the ground display terminal shows the SRUA's position, attitude, speed, and other information in real time, so that ground staff can send control instructions to the UAV according to mission requirements.
2. The autonomous landing system for a rotor unmanned aerial vehicle based on a three-layer isosceles-triangle multi-color landing pad according to claim 1, characterized in that: on top of the original Kalman-filter-based multi-information fusion, the onboard data processing unit (3) includes a height-fusion module based on adaptive weighted averaging, designed for the characteristics of the four height sensors used by the system; it improves the precision of the height measurement and, when one height sensor's information becomes unreliable, adaptively adjusts the weights so that the height information remains reliable throughout the SRUA's landing.
3. The autonomous landing system for a rotor unmanned aerial vehicle based on a three-layer isosceles-triangle multi-color landing pad according to claim 1, characterized in that: the ground landing pad (6) serving as the visual beacon is designed with practicality, distinguishability from the environment, and ease of implementation of the vision algorithm in mind. Because the UAV mostly operates in places rich in gray-level information, the visual beacon is made of pure, distinctive colors to prevent the surroundings from introducing noise; to distinguish it from colored objects that may exist nearby, the beacon is further given a regular isosceles-triangle shape so that Hu invariant moments can perform a secondary identification; and to meet the demand for high-precision vision navigation at both long and short range, a multi-layer nested isosceles-triangle pattern is chosen as the landing marker: at long range the vision solution is computed from the features of the outer, large red triangle, and at close range from those of the inner, small green triangle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410089860.7A CN103809598B (en) | 2014-03-12 | 2014-03-12 | A kind of rotor wing unmanned aerial vehicles based on three layers of isosceles triangle polychrome landing ramp autonomous landing system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410089860.7A CN103809598B (en) | 2014-03-12 | 2014-03-12 | A kind of rotor wing unmanned aerial vehicles based on three layers of isosceles triangle polychrome landing ramp autonomous landing system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103809598A true CN103809598A (en) | 2014-05-21 |
CN103809598B CN103809598B (en) | 2016-08-10 |
Family
ID=50706530
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410089860.7A Active CN103809598B (en) | 2014-03-12 | 2014-03-12 | Rotor unmanned aerial vehicle autonomous landing system based on a three-layer isosceles-triangle multi-color landing pad
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103809598B (en) |
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104049641A (en) * | 2014-05-29 | 2014-09-17 | 深圳市大疆创新科技有限公司 | Automatic landing method and device and air vehicle |
CN104166854A (en) * | 2014-08-03 | 2014-11-26 | 浙江大学 | Vision grading landmark locating and identifying method for autonomous landing of small unmanned aerial vehicle |
CN104590552A (en) * | 2014-12-08 | 2015-05-06 | 天津大学 | Miniature multi-rotor aircraft based on visual navigation |
CN104679013A (en) * | 2015-03-10 | 2015-06-03 | 无锡桑尼安科技有限公司 | Unmanned plane automatic landing system |
CN104808685A (en) * | 2015-04-27 | 2015-07-29 | 中国科学院长春光学精密机械与物理研究所 | Vision auxiliary device and method for automatic landing of unmanned aerial vehicle |
CN104808684A (en) * | 2015-04-22 | 2015-07-29 | 深圳市视晶无线技术有限公司 | Aircraft precision positioning and anchoring system and positioning and anchoring method thereof |
CN104965213A (en) * | 2015-05-27 | 2015-10-07 | 深圳市高巨创新科技开发有限公司 | Unmanned aircraft positioning method and apparatus |
CN104980708A (en) * | 2015-06-30 | 2015-10-14 | 江苏首控制造技术有限公司 | Hand-held-terminal-contained remote direct-broadcasting monitoring system of unmanned plane |
CN104994355A (en) * | 2015-07-14 | 2015-10-21 | 杨珊珊 | Intelligent active image acquisition and update system and method |
CN105302146A (en) * | 2014-07-25 | 2016-02-03 | 空中客车运营简化股份公司 | Method and system for automatic autonomous landing of an aircraft |
CN105292472A (en) * | 2014-07-28 | 2016-02-03 | 中国科学院沈阳自动化研究所 | Multi-purpose flexible-wing unmanned aerial vehicle |
WO2016078025A1 (en) * | 2014-11-19 | 2016-05-26 | 深圳市大疆创新科技有限公司 | Positioning mechanism, uav base station using the positioning mechanism, and uav replenishment method |
CN105652887A (en) * | 2016-03-22 | 2016-06-08 | 临沂高新区翔鸿电子科技有限公司 | Unmanned aerial vehicle landing method adopting two-level graph recognition |
CN105676875A (en) * | 2015-03-10 | 2016-06-15 | 张超 | Automatic landing system of unmanned aerial vehicle |
CN105867411A (en) * | 2016-04-14 | 2016-08-17 | 南方科技大学 | Landing method and landing system of unmanned aerial vehicle |
CN106371447A (en) * | 2016-10-25 | 2017-02-01 | 南京奇蛙智能科技有限公司 | Controlling method for all-weather precision landing of unmanned aerial vehicle |
CN106774386A (en) * | 2016-12-06 | 2017-05-31 | 杭州灵目科技有限公司 | Unmanned plane vision guided navigation landing system based on multiple dimensioned marker |
CN107063261A (en) * | 2017-03-29 | 2017-08-18 | 东北大学 | Multi-feature information landmark detection method for precise landing of unmanned aerial vehicle |
WO2017166724A1 (en) * | 2016-03-30 | 2017-10-05 | 乐视控股(北京)有限公司 | Aircraft apron device, aircraft, and aircraft parking system |
CN107515615A (en) * | 2017-08-30 | 2017-12-26 | 泸州深远世宁无人机科技有限公司 | Landing device for an unmanned aerial vehicle in special circumstances |
WO2018006376A1 (en) * | 2016-07-07 | 2018-01-11 | SZ DJI Technology Co., Ltd. | Method and system for controlling a movable object using machine-readable code |
CN107710091A (en) * | 2015-06-26 | 2018-02-16 | 深圳市大疆创新科技有限公司 | System and method for selecting an operation mode of a mobile platform |
WO2018035835A1 (en) * | 2016-08-26 | 2018-03-01 | SZ DJI Technology Co., Ltd. | Methods and system for autonomous landing |
CN107885219A (en) * | 2014-04-22 | 2018-04-06 | 天津远翥科技有限公司 | Flight monitoring system and method for monitoring unmanned aerial vehicle flight |
CN108227751A (en) * | 2018-01-29 | 2018-06-29 | 广州亿航智能技术有限公司 | The landing method and system of a kind of unmanned plane |
CN108536167A (en) * | 2018-07-17 | 2018-09-14 | 哈尔滨工业大学(威海) | Unmanned plane Autonomous landing method for tilting non-stationary platform |
CN108733068A (en) * | 2017-04-24 | 2018-11-02 | 菜鸟智能物流控股有限公司 | Aircraft with a flight control device |
CN108919830A (en) * | 2018-07-20 | 2018-11-30 | 南京奇蛙智能科技有限公司 | A kind of flight control method that unmanned plane precisely lands |
CN110244749A (en) * | 2019-04-22 | 2019-09-17 | 西北农林科技大学 | A kind of agricultural unmanned plane mobile platform independently precisely lands control system and method |
CN110322462A (en) * | 2019-06-13 | 2019-10-11 | 暨南大学 | Unmanned aerial vehicle vision based on 5G network feels land method and system |
CN111142546A (en) * | 2019-11-22 | 2020-05-12 | 航天时代飞鸿技术有限公司 | Multi-rotor unmanned aerial vehicle accurate landing guiding system and method |
CN111137463A (en) * | 2020-01-15 | 2020-05-12 | 亿航智能设备(广州)有限公司 | Unmanned aerial vehicle landing guiding system and method, charging platform and unmanned aerial vehicle |
CN111596674A (en) * | 2020-03-31 | 2020-08-28 | 广州中科云图智能科技有限公司 | Landing positioning method and device for unmanned aerial vehicle and unmanned aerial vehicle nest |
CN111598952A (en) * | 2020-05-21 | 2020-08-28 | 华中科技大学 | Multi-scale cooperative target design and online detection and identification method and system |
CN111857171A (en) * | 2020-07-30 | 2020-10-30 | 华南理工大学 | Unmanned aerial vehicle control method based on quality self-adaptive multilayer neural network |
CN112600632A (en) * | 2020-11-14 | 2021-04-02 | 泰州芯源半导体科技有限公司 | Wireless data communication platform using signal analysis |
CN112904879A (en) * | 2021-01-18 | 2021-06-04 | 天津大学 | Autonomous tracking landing system and method for quad-rotor aircraft based on color block landmarks |
CN114115010A (en) * | 2021-11-18 | 2022-03-01 | 中国特种飞行器研究所 | Flying height control system for unmanned fire balloon |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100842101B1 (en) * | 2007-06-15 | 2008-06-30 | Korean Air Co., Ltd. | Automatic recovery method of UAV using vision information |
US20090306840A1 (en) * | 2008-04-08 | 2009-12-10 | Blenkhorn Kevin P | Vision-based automated landing system for unmanned aerial vehicles |
WO2010045271A1 (en) * | 2008-10-14 | 2010-04-22 | Joshua Victor Aller | Target and method of detecting, identifying, and determining 3-d pose of the target |
CN101833104A (en) * | 2010-04-27 | 2010-09-15 | 北京航空航天大学 | Three-dimensional visual navigation method based on multi-sensor information fusion |
CN101944295A (en) * | 2010-09-08 | 2011-01-12 | 北京航空航天大学 | Method for arranging traffic pattern of unmanned aerial vehicle |
CN103208206A (en) * | 2013-03-21 | 2013-07-17 | 北京航空航天大学 | Method for arranging traffic patterns of unmanned aerial vehicles on terrain constraint condition |
2014
- 2014-03-12 CN CN201410089860.7A patent/CN103809598B/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100842101B1 (en) * | 2007-06-15 | 2008-06-30 | Korean Air Co., Ltd. | Automatic recovery method of UAV using vision information |
US20090306840A1 (en) * | 2008-04-08 | 2009-12-10 | Blenkhorn Kevin P | Vision-based automated landing system for unmanned aerial vehicles |
WO2010045271A1 (en) * | 2008-10-14 | 2010-04-22 | Joshua Victor Aller | Target and method of detecting, identifying, and determining 3-d pose of the target |
CN101833104A (en) * | 2010-04-27 | 2010-09-15 | 北京航空航天大学 | Three-dimensional visual navigation method based on multi-sensor information fusion |
CN101944295A (en) * | 2010-09-08 | 2011-01-12 | 北京航空航天大学 | Method for arranging traffic pattern of unmanned aerial vehicle |
CN103208206A (en) * | 2013-03-21 | 2013-07-17 | 北京航空航天大学 | Method for arranging traffic patterns of unmanned aerial vehicles on terrain constraint condition |
Non-Patent Citations (1)
Title |
---|
LEI Xusheng et al.: "An altitude information fusion method for a small unmanned rotorcraft", ROBOT (《机器人》), vol. 34, no. 4, 30 December 2012 (2012-12-30) *
Cited By (55)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107885219A (en) * | 2014-04-22 | 2018-04-06 | 天津远翥科技有限公司 | Flight monitoring system and method for monitoring unmanned aerial vehicle flight |
CN104049641A (en) * | 2014-05-29 | 2014-09-17 | 深圳市大疆创新科技有限公司 | Automatic landing method and device and air vehicle |
CN105302146A (en) * | 2014-07-25 | 2016-02-03 | 空中客车运营简化股份公司 | Method and system for automatic autonomous landing of an aircraft |
CN105292472A (en) * | 2014-07-28 | 2016-02-03 | 中国科学院沈阳自动化研究所 | Multi-purpose flexible-wing unmanned aerial vehicle |
CN104166854A (en) * | 2014-08-03 | 2014-11-26 | 浙江大学 | Vision grading landmark locating and identifying method for autonomous landing of small unmanned aerial vehicle |
CN104166854B (en) * | 2014-08-03 | 2016-06-01 | 浙江大学 | Vision grading landmark locating and identifying method for autonomous landing of small unmanned aerial vehicle |
US11524796B2 (en) | 2014-11-19 | 2022-12-13 | SZ DJI Technology Co., Ltd. | Positioning mechanism, UAV dock using same, and UAV replenishment method |
US10414517B2 (en) | 2014-11-19 | 2019-09-17 | SZ DJI Technology Co., Ltd. | Positioning mechanism, UAV dock using same, and UAV replenishment method |
WO2016078025A1 (en) * | 2014-11-19 | 2016-05-26 | 深圳市大疆创新科技有限公司 | Positioning mechanism, uav base station using the positioning mechanism, and uav replenishment method |
CN104590552A (en) * | 2014-12-08 | 2015-05-06 | 天津大学 | Miniature multi-rotor aircraft based on visual navigation |
CN105676875A (en) * | 2015-03-10 | 2016-06-15 | 张超 | Automatic landing system of unmanned aerial vehicle |
CN105068553A (en) * | 2015-03-10 | 2015-11-18 | 无锡桑尼安科技有限公司 | Unmanned aerial vehicle automatic landing system |
CN104679013A (en) * | 2015-03-10 | 2015-06-03 | 无锡桑尼安科技有限公司 | Unmanned plane automatic landing system |
CN104808684A (en) * | 2015-04-22 | 2015-07-29 | 深圳市视晶无线技术有限公司 | Aircraft precision positioning and anchoring system and positioning and anchoring method thereof |
CN104808684B (en) * | 2015-04-22 | 2017-11-03 | 深圳市视晶无线技术有限公司 | Aircraft precision positioning anchoring system and its positioning anchorage method |
CN104808685A (en) * | 2015-04-27 | 2015-07-29 | 中国科学院长春光学精密机械与物理研究所 | Vision auxiliary device and method for automatic landing of unmanned aerial vehicle |
CN104965213A (en) * | 2015-05-27 | 2015-10-07 | 深圳市高巨创新科技开发有限公司 | Unmanned aircraft positioning method and apparatus |
US10730618B2 (en) | 2015-06-26 | 2020-08-04 | SZ DJI Technology Co., Ltd. | System and method for selecting an operation mode of a mobile platform |
US11465743B2 (en) | 2015-06-26 | 2022-10-11 | SZ DJI Technology Co., Ltd. | System and method for selecting an operation mode of a mobile platform |
CN107710091B (en) * | 2015-06-26 | 2021-07-16 | 深圳市大疆创新科技有限公司 | System and method for selecting an operating mode of a mobile platform |
CN107710091A (en) * | 2015-06-26 | 2018-02-16 | 深圳市大疆创新科技有限公司 | System and method for selecting an operation mode of a mobile platform |
CN104980708A (en) * | 2015-06-30 | 2015-10-14 | 江苏首控制造技术有限公司 | Hand-held-terminal-contained remote direct-broadcasting monitoring system of unmanned plane |
CN104994355A (en) * | 2015-07-14 | 2015-10-21 | 杨珊珊 | Intelligent active image acquisition and update system and method |
CN105652887A (en) * | 2016-03-22 | 2016-06-08 | 临沂高新区翔鸿电子科技有限公司 | Unmanned aerial vehicle landing method adopting two-level graph recognition |
WO2017166724A1 (en) * | 2016-03-30 | 2017-10-05 | 乐视控股(北京)有限公司 | Aircraft apron device, aircraft, and aircraft parking system |
CN105867411A (en) * | 2016-04-14 | 2016-08-17 | 南方科技大学 | Landing method and landing system of unmanned aerial vehicle |
US10901436B2 (en) | 2016-07-07 | 2021-01-26 | SZ DJI Technology Co., Ltd. | Method and system for controlling a movable object using machine-readable code |
WO2018006376A1 (en) * | 2016-07-07 | 2018-01-11 | SZ DJI Technology Co., Ltd. | Method and system for controlling a movable object using machine-readable code |
WO2018035835A1 (en) * | 2016-08-26 | 2018-03-01 | SZ DJI Technology Co., Ltd. | Methods and system for autonomous landing |
US11194344B2 (en) | 2016-08-26 | 2021-12-07 | SZ DJI Technology Co., Ltd. | Methods and system for autonomous landing |
US11693428B2 (en) | 2016-08-26 | 2023-07-04 | SZ DJI Technology Co., Ltd. | Methods and system for autonomous landing |
CN106371447A (en) * | 2016-10-25 | 2017-02-01 | 南京奇蛙智能科技有限公司 | Controlling method for all-weather precision landing of unmanned aerial vehicle |
CN106774386B (en) * | 2016-12-06 | 2019-08-13 | 杭州灵目科技有限公司 | Unmanned plane vision guided navigation landing system based on multiple dimensioned marker |
CN106774386A (en) * | 2016-12-06 | 2017-05-31 | 杭州灵目科技有限公司 | Unmanned plane vision guided navigation landing system based on multiple dimensioned marker |
CN107063261A (en) * | 2017-03-29 | 2017-08-18 | 东北大学 | Multi-feature information landmark detection method for precise landing of unmanned aerial vehicle |
CN107063261B (en) * | 2017-03-29 | 2020-01-17 | 东北大学 | Multi-feature information landmark detection method for precise landing of unmanned aerial vehicle |
CN108733068A (en) * | 2017-04-24 | 2018-11-02 | 菜鸟智能物流控股有限公司 | Aircraft with a flight control device |
CN107515615A (en) * | 2017-08-30 | 2017-12-26 | 泸州深远世宁无人机科技有限公司 | Landing device for an unmanned aerial vehicle in special circumstances |
CN108227751A (en) * | 2018-01-29 | 2018-06-29 | 广州亿航智能技术有限公司 | The landing method and system of a kind of unmanned plane |
CN108227751B (en) * | 2018-01-29 | 2020-12-29 | 广州亿航智能技术有限公司 | Landing method and system of unmanned aerial vehicle |
CN108536167A (en) * | 2018-07-17 | 2018-09-14 | 哈尔滨工业大学(威海) | Unmanned plane Autonomous landing method for tilting non-stationary platform |
CN108919830A (en) * | 2018-07-20 | 2018-11-30 | 南京奇蛙智能科技有限公司 | A kind of flight control method that unmanned plane precisely lands |
CN110244749A (en) * | 2019-04-22 | 2019-09-17 | 西北农林科技大学 | A kind of agricultural unmanned plane mobile platform independently precisely lands control system and method |
CN110322462B (en) * | 2019-06-13 | 2021-07-27 | 暨南大学 | Unmanned aerial vehicle visual landing method and system based on 5G network |
CN110322462A (en) * | 2019-06-13 | 2019-10-11 | 暨南大学 | Unmanned aerial vehicle visual landing method and system based on 5G network |
CN111142546A (en) * | 2019-11-22 | 2020-05-12 | 航天时代飞鸿技术有限公司 | Multi-rotor unmanned aerial vehicle accurate landing guiding system and method |
CN111137463A (en) * | 2020-01-15 | 2020-05-12 | 亿航智能设备(广州)有限公司 | Unmanned aerial vehicle landing guiding system and method, charging platform and unmanned aerial vehicle |
CN111596674A (en) * | 2020-03-31 | 2020-08-28 | 广州中科云图智能科技有限公司 | Landing positioning method and device for unmanned aerial vehicle and unmanned aerial vehicle nest |
CN111598952A (en) * | 2020-05-21 | 2020-08-28 | 华中科技大学 | Multi-scale cooperative target design and online detection and identification method and system |
CN111598952B (en) * | 2020-05-21 | 2022-07-08 | 华中科技大学 | Multi-scale cooperative target design and online detection identification method and system |
CN111857171A (en) * | 2020-07-30 | 2020-10-30 | 华南理工大学 | Unmanned aerial vehicle control method based on quality self-adaptive multilayer neural network |
CN112600632A (en) * | 2020-11-14 | 2021-04-02 | 泰州芯源半导体科技有限公司 | Wireless data communication platform using signal analysis |
CN112904879A (en) * | 2021-01-18 | 2021-06-04 | 天津大学 | Autonomous tracking landing system and method for quad-rotor aircraft based on color block landmarks |
CN112904879B (en) * | 2021-01-18 | 2024-04-12 | 天津大学 | Autonomous tracking landing system and method for four-rotor aircraft based on color block landmarks |
CN114115010A (en) * | 2021-11-18 | 2022-03-01 | 中国特种飞行器研究所 | Flying height control system for unmanned fire balloon |
Also Published As
Publication number | Publication date |
---|---|
CN103809598B (en) | 2016-08-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103809598A (en) | Rotor unmanned aircraft independent take-off and landing system based on three-layer triangle multi-color landing ground | |
US20230350428A1 (en) | Methods and system for autonomous landing | |
US10824170B2 (en) | Autonomous cargo delivery system | |
US20200344464A1 (en) | Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Defects | |
CN103869822B (en) | The perception of many rotor wing unmanned aerial vehicles and avoidance system and bypassing method thereof | |
Kendoul | Survey of advances in guidance, navigation, and control of unmanned rotorcraft systems | |
CN113597591A (en) | Geographic reference for unmanned aerial vehicle navigation | |
US20190068829A1 (en) | Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Obstructions | |
US20170313439A1 (en) | Methods and syststems for obstruction detection during autonomous unmanned aerial vehicle landings | |
CN104015931B (en) | Vision localization, measurement and control method, system and experimental platform for automatic refueling dead zone of unmanned aerial vehicle | |
CN204406209U (en) | Autonomous Landing of UAV system | |
CN104808675A (en) | Intelligent terminal-based somatosensory flight operation and control system and terminal equipment | |
CN104808685A (en) | Vision auxiliary device and method for automatic landing of unmanned aerial vehicle | |
CN107144281B (en) | Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision | |
CN104679013A (en) | Unmanned plane automatic landing system | |
EP3077760B1 (en) | Payload delivery | |
CN107783106A (en) | Data fusion method between unmanned plane and barrier | |
CN103852077A (en) | Automatic anti-cheating judgment method for unmanned aerial vehicle positioning information in link failure process | |
CN102190081A (en) | Vision-based fixed point robust control method for airship | |
CN102788579A (en) | Unmanned aerial vehicle visual navigation method based on SIFT algorithm | |
CN107783119A (en) | Apply the Decision fusion method in obstacle avoidance system | |
CN105208346B (en) | Transmission facility identification method based on unmanned plane | |
RU155323U1 (en) | UNMANNED AIRCRAFT CONTROL SYSTEM | |
CN112639655A (en) | Control method and device for return flight of unmanned aerial vehicle, movable platform and storage medium | |
KR102467855B1 (en) | A method for setting an autonomous navigation map, a method for an unmanned aerial vehicle to fly autonomously based on an autonomous navigation map, and a system for implementing the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |