CN104679013A - UAV automatic landing system - Google Patents
UAV automatic landing system
- Publication number
- CN104679013A CN104679013A CN201510104994.6A CN201510104994A CN104679013A CN 104679013 A CN104679013 A CN 104679013A CN 201510104994 A CN201510104994 A CN 201510104994A CN 104679013 A CN104679013 A CN 104679013A
- Authority
- CN
- China
- Prior art keywords
- landmark
- suspected
- UAV
- image
- height
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
The invention relates to a UAV automatic landing system. The system comprises an aerial camera, a landmark locator, a UAV drive device and a main controller. The aerial camera photographs the suspected landmark region below the UAV to obtain a suspected landmark image. The landmark locator performs image processing on the suspected landmark image and, when it determines that a landmark is present in the image, obtains the relative height and relative positioning distance from the UAV to the landmark. The main controller, connected to both the landmark locator and the UAV drive device, controls the drive device to land the UAV on the landmark according to the relative height and relative positioning distance. The system accomplishes automatic, accurate landing of a UAV over a variety of terrains and improves the intelligence level of the UAV.
Description
Technical field
The present invention relates to the field of UAV navigation, and in particular to a UAV automatic landing system.
Background art
With the growing maturity of UAV technology and the continued expansion of aerial photography, UAVs are routinely used in military applications such as reconnaissance and surveillance. More significantly, civil UAV applications are also increasingly widespread, including photogrammetry, emergency and disaster relief, public safety, resource exploration, environmental monitoring, natural disaster monitoring and assessment, urban planning and municipal administration, and forest fire, disease and pest protection and monitoring.
Because UAVs play an ever more important role in these fields, users demand ever higher levels of flight intelligence. In general, takeoff is comparatively easy to control, whereas landing and recovery are difficult; yet autonomous landing is essential for recovering a UAV after it completes its mission, so improving the accuracy of autonomous landing greatly raises the intelligence level of the UAV.
In the prior art, UAV autonomous landing relies on an inertial navigation system (INS), a GPS navigation system, or an INS/GPS integrated navigation system. However, INS equipment is expensive, a GPS-only system is not highly accurate, and none of these systems makes landing fully automatic. In addition, existing UAV landing control is limited to finding, locating and landing on landmarks in specific terrain, and cannot achieve autonomous landing over all terrains.
Therefore, a new UAV autonomous landing scheme is needed: one that strikes a good balance between cost and accuracy, offering good cost-effectiveness, while being able to find the landing landmark over various terrains and land the UAV smoothly and fully automatically.
Summary of the invention
To solve the above problems, the invention provides a UAV automatic landing system that accurately finds the landing landmark by combining landmark template matching with affine invariant moment matching of the landmark pattern; obtains the position of the UAV relative to the landing landmark through navigation, height measurement and image processing techniques; and, through the combined use of terrain recognition and wireless communication, achieves landing landmark search over different terrains. The whole landing process requires no external intervention, is fully automatic, and has a favorable cost-performance ratio.
According to one aspect of the invention, a UAV automatic landing system is provided. The system comprises an aerial camera, a landmark locator, a UAV drive device and a main controller. The aerial camera photographs the suspected landmark region below the UAV to obtain a suspected landmark image. The landmark locator performs image processing on the suspected landmark image and, when it determines that a landmark is present, obtains the relative height and relative positioning distance from the UAV to the landmark. The main controller, connected to the landmark locator and the UAV drive device, controls the drive device, based on the relative height and relative positioning distance, to land the UAV on the landmark.
More specifically, the UAV automatic landing system further comprises the following.
A terrain recognizer, connected to the aerial camera, determines from the suspected landmark image the type of terrain below the UAV; the terrain types include plain, grassland, mountain, desert, warship deck, gobi, city and hills.
A wireless transceiver establishes a bidirectional wireless communication link with a remote UAV control platform and is connected to the terrain recognizer and the storage device. It sends the terrain type below the UAV to the control platform, receives in return the landmark segmentation color data corresponding to that terrain type, and stores the data in the storage device. The landmark segmentation color data comprises a landmark R channel range, a landmark G channel range and a landmark B channel range, which are used to separate the landmark in an RGB image from the image background.
A GPS locator, connected to GPS navigation satellites, receives real-time positioning data of the UAV's position.
A storage device pre-stores a preset height range, a preset barometric height weight and a preset ultrasonic height weight, as well as reference image templates of various kinds of landmarks and the affine invariant moment features of those landmarks. The reference image template of each kind of landmark is a pattern obtained by photographing the reference landmark of that kind in advance, and the affine invariant moment features of each kind are extracted from its reference image template.
A height sensing device, connected to the storage device, comprises a barometric height sensor, an ultrasonic height sensor and a microcontroller. The barometric height sensor detects the real-time barometric height of the UAV from the air pressure change near the UAV. The ultrasonic height sensor comprises an ultrasonic transmitter, an ultrasonic receiver and a single-chip microcomputer; the transmitter emits ultrasonic waves toward the ground, the receiver receives the echo reflected by the ground, and the single-chip microcomputer computes the real-time ultrasonic height of the UAV from the transmission time, the reception time and the propagation speed of the ultrasonic waves. The microcontroller, connected to the barometric height sensor, the ultrasonic height sensor and the storage device, computes and outputs a real-time height based on the preset barometric height weight, the preset ultrasonic height weight and the two real-time measurements when the difference between the real-time barometric height and the real-time ultrasonic height lies within the preset height range; otherwise it outputs a height detection failure signal.
The aerial camera is a linear-array digital aerial camera comprising a shock-absorbing mount, a front cover glass, a lens, a filter and an imaging electronics unit, for photographing the suspected landmark region to obtain the suspected landmark image.
The landmark locator, connected to the aerial camera, the storage device, the GPS locator and the height sensing device, comprises an image preprocessing sub-device, a suspected landmark segmentation sub-device, a landmark recognition sub-device and a relative position detection sub-device. The image preprocessing sub-device applies contrast enhancement, median filtering and RGB color space conversion, in that order, to the suspected landmark image to obtain a suspected landmark RGB image. The suspected landmark segmentation sub-device, connected to the image preprocessing sub-device and the storage device, computes the R, G and B channel values of each pixel of the suspected landmark RGB image; a pixel whose R value lies in the landmark R channel range, whose G value lies in the landmark G channel range and whose B value lies in the landmark B channel range is classified as a suspected landmark pixel, and all suspected landmark pixels are combined to form a suspected landmark sub-pattern. The landmark recognition sub-device, connected to the suspected landmark segmentation sub-device and the storage device, computes the affine invariant moment features of the suspected landmark sub-pattern and matches the sub-pattern one by one against the reference image templates of the various kinds of landmarks. If matching fails, it outputs a no-landmark signal; if matching succeeds, it obtains the matched landmark type and compares the affine invariant moment features stored for that type with those of the suspected landmark sub-pattern: if they differ, it outputs a no-landmark signal; if they agree, it outputs a landmark-present signal, the matched landmark type and the image-relative position of the sub-pattern within the suspected landmark RGB image. The relative position detection sub-device, connected to the landmark recognition sub-device, the GPS locator and the height sensing device, computes, upon receiving the landmark-present signal, the relative positioning distance from the UAV to the landmark based on the image-relative position and the real-time positioning data, and takes the real-time height as the relative height of the UAV above the landmark.
The main controller, connected to the landmark locator, the height sensing device and the UAV drive device, forwards any height detection failure signal or no-landmark signal to the UAV control platform through the wireless transceiver; upon receiving the relative positioning distance and the relative height, it controls the UAV drive device, based on them, to land the UAV on the landmark.
More specifically, in the UAV automatic landing system: when the terrain below the UAV is plain, the corresponding landmark segmentation color data is a landmark R channel range of 150 to 255, a landmark G channel range of 0 to 120 and a landmark B channel range of 1 to 150.
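As an illustration only (the function name and NumPy array representation below are not part of the patent), these plain-terrain channel ranges could drive the per-pixel classification described in the embodiment roughly as follows:

```python
import numpy as np

# Plain-terrain landmark segmentation color data from the embodiment:
# R in [150, 255], G in [0, 120], B in [1, 150].
PLAIN_RANGES = {"R": (150, 255), "G": (0, 120), "B": (1, 150)}

def suspected_landmark_mask(rgb, ranges=PLAIN_RANGES):
    """Mark pixels whose R, G and B values all fall within the landmark ranges.
    rgb: H x W x 3 uint8 array; returns an H x W boolean mask."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    (r_lo, r_hi), (g_lo, g_hi), (b_lo, b_hi) = (
        ranges["R"], ranges["G"], ranges["B"])
    return ((r >= r_lo) & (r <= r_hi) &
            (g >= g_lo) & (g <= g_hi) &
            (b >= b_lo) & (b <= b_hi))
```

In the patent, the pixels selected by such a mask are then combined into the suspected landmark sub-pattern that is handed to the recognition stage.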
More specifically, in the UAV automatic landing system: the aerial camera is an ultra-high-definition aerial camera, and the captured suspected landmark image has a resolution of 3840 × 2160.
More specifically, in the UAV automatic landing system: the image preprocessing sub-device, the suspected landmark segmentation sub-device, the landmark recognition sub-device and the relative position detection sub-device are each implemented on a separate FPGA chip.
More specifically, in the UAV automatic landing system: the image preprocessing sub-device, the suspected landmark segmentation sub-device, the landmark recognition sub-device and the relative position detection sub-device are integrated on a single printed circuit board.
More specifically, in the UAV automatic landing system: the UAV is a rotary-wing UAV.
Brief description of the drawings
Embodiments of the invention are described below with reference to the accompanying drawings, in which:
Fig. 1 is a block diagram of a UAV automatic landing system according to an embodiment of the invention.
Fig. 2 is a block diagram of the landmark locator of a UAV automatic landing system according to an embodiment of the invention.
Detailed description
Embodiments of the UAV automatic landing system of the invention are described in detail below with reference to the accompanying drawings.
A UAV, i.e. an unmanned aircraft (abbreviated "UAV", unmanned aerial vehicle), is an unpiloted aircraft operated by radio remote control or by its own on-board programmed controller. Technically, UAVs fall into several large classes: unmanned helicopters, unmanned fixed-wing aircraft, unmanned multi-rotor aircraft, unmanned airships and unmanned paragliders. By purpose they divide into military and civil UAVs. Militarily, they are used for battlefield reconnaissance and surveillance, artillery spotting, damage assessment and electronic warfare; in the civil domain, for border patrol, nuclear radiation detection, aerial photography, airborne mineral exploration, disaster monitoring, traffic patrol and security monitoring.
UAV applications on the military side need no elaboration, and civil applications are now also increasingly widespread. Owing to the particularities of UAV flight control, takeoff is usually easy to control, whereas landing, i.e. recovery, requires flying to a specific landing point, such as the position of a landing landmark; the demands on flight control are therefore higher and the control is more difficult. How to reduce manual operation during landing and achieve high-precision, fully automatic landing over various terrains is one of the urgent problems facing every UAV company.
Prior-art UAV landing methods cannot achieve an efficient balance between high accuracy and low cost, cannot achieve autonomous landing over all terrains, and are not highly intelligent. For this reason, the invention builds a UAV automatic landing system that employs targeted image processing devices to improve the accuracy of landing-site localization, while the combined use of terrain recognition and wireless communication meets the landing requirements of different terrains.
Fig. 1 is a block diagram of a UAV automatic landing system according to an embodiment of the invention. The system comprises a UAV drive device 1, a main controller 2, an aerial camera 3 and a landmark locator 4. The aerial camera 3 is connected to the landmark locator 4, and the main controller 2 is connected to the UAV drive device 1, the aerial camera 3 and the landmark locator 4.
The aerial camera 3 photographs the suspected landmark region below the UAV to obtain a suspected landmark image. The landmark locator 4 performs image processing on the suspected landmark image and, when it determines that a landmark is present, obtains the relative height and relative positioning distance from the UAV to the landmark. The main controller 2 controls the UAV drive device 1, based on the relative height and relative positioning distance, to land the UAV on the landmark.
Next, the concrete structure of the UAV automatic landing system of the invention is described in further detail.
The system further comprises a terrain recognizer, connected to the aerial camera 3, for determining from the suspected landmark image the type of terrain below the UAV; the terrain types include plain, grassland, mountain, desert, warship deck, gobi, city and hills.
The system further comprises a wireless transceiver, which establishes a bidirectional wireless communication link with a remote UAV control platform and is connected to the terrain recognizer and the storage device. The transceiver sends the terrain type below the UAV to the control platform, receives in return the landmark segmentation color data corresponding to that terrain type, and stores the data in the storage device. The landmark segmentation color data comprises a landmark R channel range, a landmark G channel range and a landmark B channel range, which are used to separate the landmark in an RGB image from the image background.
The system further comprises a GPS locator, connected to GPS navigation satellites, for receiving real-time positioning data of the UAV's position.
The system further comprises a storage device, which pre-stores a preset height range, a preset barometric height weight and a preset ultrasonic height weight, as well as reference image templates of various kinds of landmarks and the affine invariant moment features of those landmarks. The reference image template of each kind of landmark is a pattern obtained by photographing the reference landmark of that kind in advance, and the affine invariant moment features of each kind are extracted from its reference image template.
The system further comprises a height sensing device, connected to the storage device and comprising a barometric height sensor, an ultrasonic height sensor and a microcontroller. The barometric height sensor detects the real-time barometric height of the UAV from the air pressure change near the UAV. The ultrasonic height sensor comprises an ultrasonic transmitter, an ultrasonic receiver and a single-chip microcomputer; the single-chip microcomputer is connected to the transmitter and the receiver, the transmitter emits ultrasonic waves toward the ground, the receiver receives the echo reflected by the ground, and the single-chip microcomputer computes the real-time ultrasonic height of the UAV from the transmission time, the reception time and the propagation speed of the ultrasonic waves. The microcontroller is connected to the barometric height sensor, the ultrasonic height sensor and the storage device. When the difference between the real-time barometric height and the real-time ultrasonic height lies within the preset height range, it computes and outputs a real-time height based on the preset barometric height weight, the preset ultrasonic height weight and the two real-time measurements; otherwise it outputs a height detection failure signal.
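A minimal sketch of the two height computations just described follows. The patent only states that the fusion weights and the preset range are pre-stored, not their values, so the speed of sound, weights and range below are illustrative assumptions:

```python
V_SOUND = 340.0  # m/s, nominal speed of sound (assumption; the patent gives no value)

def ultrasonic_height(t_transmit, t_receive, v_sound=V_SOUND):
    """Real-time ultrasonic height from time of flight: the echo travels
    down and back, so the one-way distance is half the round trip."""
    return v_sound * (t_receive - t_transmit) / 2.0

def fused_height(baro_h, ultra_h, w_baro=0.4, w_ultra=0.6, preset_range=2.0):
    """Weighted fusion of the barometric and ultrasonic heights.
    Returns the fused real-time height, or None standing in for the
    'height detection failure' signal when the two sensors disagree by
    more than preset_range. Weights and range are illustrative."""
    if abs(baro_h - ultra_h) > preset_range:
        return None  # height detection failure signal
    return w_baro * baro_h + w_ultra * ultra_h
```

The range check lets the microcontroller reject cases where one sensor is clearly wrong (e.g. the ultrasonic echo bouncing off vegetation) rather than averaging a bad reading into the height estimate.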
The aerial camera 3 is a linear-array digital aerial camera comprising a shock-absorbing mount, a front cover glass, a lens, a filter and an imaging electronics unit, and photographs the suspected landmark region to obtain the suspected landmark image.
As shown in Fig. 2, the landmark locator 4 is connected to the aerial camera 3, the storage device, the GPS locator and the height sensing device, and comprises an image preprocessing sub-device 41, a suspected landmark segmentation sub-device 42, a landmark recognition sub-device 43 and a relative position detection sub-device 44.
The image preprocessing sub-device 41 applies contrast enhancement, median filtering and RGB color space conversion, in that order, to the suspected landmark image to obtain a suspected landmark RGB image.
The suspected landmark segmentation sub-device 42, connected to the image preprocessing sub-device 41 and the storage device, computes the R, G and B channel values of each pixel of the suspected landmark RGB image. A pixel whose R value lies within the landmark R channel range, whose G value lies within the landmark G channel range and whose B value lies within the landmark B channel range is classified as a suspected landmark pixel, and all suspected landmark pixels in the image are combined to form a suspected landmark sub-pattern.
The landmark recognition sub-device 43, connected to the suspected landmark segmentation sub-device 42 and the storage device, computes the affine invariant moment features of the suspected landmark sub-pattern and matches the sub-pattern one by one against the reference image templates of the various kinds of landmarks. If matching fails, it outputs a no-landmark signal. If matching succeeds, it obtains the matched landmark type and compares the affine invariant moment features stored for that type with those of the suspected landmark sub-pattern: if they differ, it outputs a no-landmark signal; if they agree, it outputs a landmark-present signal, the matched landmark type and the image-relative position of the sub-pattern within the suspected landmark RGB image.
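The patent does not state which affine invariant moment features are used. One common choice is the first affine moment invariant of Flusser and Suk, I1 = (mu20·mu02 − mu11²) / mu00⁴, which the hypothetical sketch below computes for a binary sub-pattern:

```python
import numpy as np

def central_moment(img, p, q):
    """Central moment mu_pq of an image treated as a 2-D density."""
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    m00 = img.sum()
    x_bar = (xs * img).sum() / m00
    y_bar = (ys * img).sum() / m00
    return (((xs - x_bar) ** p) * ((ys - y_bar) ** q) * img).sum()

def affine_invariant_i1(img):
    """First affine moment invariant: I1 = (mu20*mu02 - mu11^2) / mu00^4."""
    mu00 = float(img.sum())
    mu20 = central_moment(img, 2, 0)
    mu02 = central_moment(img, 0, 2)
    mu11 = central_moment(img, 1, 1)
    return (mu20 * mu02 - mu11 ** 2) / mu00 ** 4
```

Because I1 is built from second-order central moments, it is unchanged by translation and rotation of the pattern and, in the continuous limit, by any affine transform, which is what makes such features usable for comparing a segmented sub-pattern against stored reference features taken from a different viewpoint.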
The relative position detection sub-device 44, connected to the landmark recognition sub-device 43, the GPS locator and the height sensing device, computes, upon receiving the landmark-present signal, the relative positioning distance from the UAV to the landmark based on the image-relative position and the real-time positioning data, and takes the real-time height as the relative height of the UAV above the landmark.
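The patent does not specify the geometry for converting the image-relative position into the relative positioning distance. Under the assumption of a calibrated pinhole camera pointing straight down, a hypothetical sketch is:

```python
import math

def relative_positioning_distance(dx_px, dy_px, relative_height_m, focal_px):
    """Ground distance (m) from the UAV's nadir point to the landmark.
    dx_px, dy_px: landmark offset from the image center, in pixels.
    focal_px: focal length in pixel units (assumed known from calibration).
    Pinhole model: ground offset = pixel offset * height / focal length."""
    gx = dx_px * relative_height_m / focal_px
    gy = dy_px * relative_height_m / focal_px
    return math.hypot(gx, gy)
```

A real system would also fold in the camera's attitude and the GPS positioning data mentioned in the embodiment; the sketch shows only the height-scaled projection step.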
The main controller 2 is connected to the landmark locator 4, the height sensing device and the UAV drive device 1. Upon receiving a height detection failure signal or a no-landmark signal, it forwards that signal to the UAV control platform through the wireless transceiver; upon receiving the relative positioning distance and the relative height, it controls the UAV drive device 1, based on them, to land the UAV on the landmark.
In this system: when the terrain below the UAV is plain, the corresponding landmark segmentation color data is a landmark R channel range of 150 to 255, a landmark G channel range of 0 to 120 and a landmark B channel range of 1 to 150; the aerial camera 3 is an ultra-high-definition aerial camera, and the captured suspected landmark image has a resolution of 3840 × 2160; the image preprocessing sub-device 41, the suspected landmark segmentation sub-device 42, the landmark recognition sub-device 43 and the relative position detection sub-device 44 are each implemented on a separate FPGA chip or, alternatively, integrated on a single printed circuit board; and, optionally, the UAV is a rotary-wing UAV.
In addition, the median filter is a nonlinear digital filtering technique commonly used to remove noise from images and other signals. Its design idea is to examine each sample of the input signal and judge whether it is representative, using a viewing window composed of an odd number of samples. The values in the window are sorted and the median, the middle value of the window, becomes the output; the oldest value is then discarded, a new sample is taken in, and the computation repeats.
In image processing, some degree of noise reduction is usually required before further processing such as edge detection. Median filtering is a standard step in image processing and is particularly effective against speckle noise and salt-and-pepper noise. Its edge-preserving property also makes it useful where blurred edges are undesirable.
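The operation just described can be sketched in a few lines of NumPy (a 3×3 window with edge replication at the borders; a production system would use an optimized library routine or, as in this patent's embodiment, an FPGA implementation):

```python
import numpy as np

def median_filter_3x3(img):
    """3x3 median filter: each output pixel is the median of its 3x3
    neighborhood; borders are handled by edge replication."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    # Stack the nine shifted views of the image, then take the per-pixel median.
    windows = np.stack([padded[dy:dy + h, dx:dx + w]
                        for dy in range(3) for dx in range(3)])
    return np.median(windows, axis=0)
```

An isolated salt-noise pixel is a minority of one in its nine-sample window, so the sorted median ignores it, which is exactly the salt-and-pepper robustness described above.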
In addition, the storage device may be an SDRAM, i.e. Synchronous Dynamic Random Access Memory. "Synchronous" means the memory requires a synchronizing clock, to which its internal command and data transfers are referenced; "dynamic" means the storage array must be refreshed constantly so that data is not lost; "random" means data is not stored in a fixed linear sequence, but can be read from and written to any specified address.
SDRAM has so far evolved through four generations: first-generation SDR SDRAM, second-generation DDR SDRAM, third-generation DDR2 SDRAM and fourth-generation DDR3 SDRAM. First-generation SDRAM uses a single-ended clock signal; the second, third and fourth generations, operating at higher frequencies, use a differential clock signal as the synchronizing clock to reduce interference. For SDR SDRAM the clock frequency equals the data rate, and first-generation memory is named after it: PC100 and PC133 indicate a clock signal, and hence a data rate, of 100 MHz or 133 MHz. The later DDR (Double Data Rate), DDR2 and DDR3 generations are instead named after their data rate, with a prefix indicating the DDR generation: PC = DDR, PC2 = DDR2, PC3 = DDR3. For example, PC2700 is DDR333: its operating frequency is 333/2 ≈ 166 MHz, and 2700 indicates a bandwidth of 2.7 GB/s. The data rates run from DDR200 to DDR400 for DDR, from DDR2-400 to DDR2-800 for DDR2, and from DDR3-800 to DDR3-1600 for DDR3.
The UAV automatic landing system of the present invention addresses the technical problem that existing UAV landing technology cannot combine a good performance-to-cost ratio with high precision and a high level of intelligence. By combining landmark template matching with affine-invariant-moment matching of the landmark pattern, it meets the requirements of both high precision and high cost-effectiveness; through the coordinated use of terrain recognition technology and wireless communication technology, it achieves landmark search over various landing fields under various terrains; finally, based on the relative position of the found landmark with respect to the UAV, it drives the UAV to fly automatically to the expected landing point.
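The moment features the description relies on can be illustrated with the first two Hu moments, a common choice of moment invariants for verifying a matched template (a sketch: the patent does not specify which "affine invariant moments" it uses, and Hu moments are strictly invariant only to translation, scale and rotation, which is often used as a practical proxy):

```python
def hu_invariants(img):
    """First two Hu moment invariants of a 2D grayscale image (list of lists).

    Computes raw moments, central moments (translation invariance),
    normalized central moments (scale invariance), then combines them
    into the rotation-invariant quantities phi1 and phi2.
    """
    h, w = len(img), len(img[0])

    def m(p, q):  # raw moment
        return sum(x**p * y**q * img[y][x] for y in range(h) for x in range(w))

    m00 = m(0, 0)
    xc, yc = m(1, 0) / m00, m(0, 1) / m00  # centroid

    def mu(p, q):  # central moment
        return sum((x - xc)**p * (y - yc)**q * img[y][x]
                   for y in range(h) for x in range(w))

    def eta(p, q):  # normalized central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return phi1, phi2
```

Because phi1 and phi2 are unchanged when the pattern is shifted, scaled or rotated, they can confirm that a segmented sub-pattern really is the matched landmark type, as the landmark recognition step in the claims requires.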
It should be understood that although the present invention is disclosed above by way of preferred embodiments, those embodiments are not intended to limit the invention. Any person of ordinary skill in the art may, without departing from the scope of the technical solution of the present invention, use the technical content disclosed above to make many possible variations and modifications to the technical solution, or revise it into equivalent embodiments of equivalent variation. Therefore, any simple amendment, equivalent variation or modification made to the above embodiments in accordance with the technical essence of the present invention, without departing from the content of the technical solution, still falls within the protection scope of the technical solution of the present invention.
Claims (7)
1. A UAV automatic landing system, characterized in that the system comprises an aerial camera, a landmark locator, a UAV drive device and a master controller; the aerial camera photographs the suspected landmark region beneath the aircraft to obtain a suspected landmark image; the landmark locator performs image processing on the suspected landmark image and, when it determines that a landmark is present in the suspected landmark image, obtains the relative height and the relative positioning distance of the UAV from the landmark; the master controller is connected to the landmark locator and the UAV drive device respectively, and controls the UAV drive device, based on the relative height and the relative positioning distance, to drive the UAV to land on the landmark.
2. The UAV automatic landing system as claimed in claim 1, characterized in that the system further comprises:
a terrain recognizer, connected to the aerial camera, for determining the type of terrain beneath the aircraft based on the suspected landmark image, the terrain type beneath the aircraft including plain, meadow, mountain area, desert, naval vessel, gobi, city and hills;
a wireless transceiver device, which establishes a two-way wireless communication link with a remote UAV control platform and is connected to the terrain recognizer and the storage device respectively; it sends the terrain type beneath the aircraft to the UAV control platform, receives the landmark segmentation color data corresponding to that terrain type returned by the UAV control platform, and stores the landmark segmentation color data in the storage device; the landmark segmentation color data comprise a landmark R channel range, a landmark G channel range and a landmark B channel range, which are used to separate the landmark in an RGB image from the RGB image background;
a GPS locator, connected to GPS navigation satellites, for receiving real-time positioning data of the UAV's position;
a storage device, for pre-storing a preset height range, a preset barometric height weight and a preset ultrasonic height weight, and also for pre-storing reference image templates of landmarks of various kinds and affine invariant moment features of landmarks of various kinds; the reference image template of each kind of landmark is a pattern obtained by photographing the reference landmark of that kind in advance, and the affine invariant moment features of each kind of landmark are extracted from the reference image template of that kind of landmark;
a height sensing device, connected to the storage device, comprising a barometric altitude sensor, an ultrasonic height sensor and a microcontroller; the barometric altitude sensor detects the real-time barometric altitude of the UAV's position according to the air-pressure change near the UAV; the ultrasonic height sensor comprises an ultrasonic transmitter, an ultrasonic receiver and a single-chip microcomputer, the single-chip microcomputer being connected to the ultrasonic transmitter and the ultrasonic receiver respectively; the ultrasonic transmitter emits ultrasonic waves toward the ground, the ultrasonic receiver receives the ultrasonic waves reflected from the ground, and the single-chip microcomputer calculates the real-time ultrasonic height of the UAV from the transmission time of the ultrasonic transmitter, the reception time of the ultrasonic receiver and the propagation velocity of ultrasound; the microcontroller is connected to the barometric altitude sensor, the ultrasonic height sensor and the storage device respectively; when the difference between the real-time barometric altitude and the real-time ultrasonic height lies within the preset height range, it calculates and outputs the real-time height based on the preset barometric height weight, the preset ultrasonic height weight, the real-time barometric altitude and the real-time ultrasonic height; when the difference between the real-time barometric altitude and the real-time ultrasonic height does not lie within the preset height range, it outputs a height-detection failure signal;
the aerial camera is a linear-array digital aerial video camera, comprising a shock-absorbing landing frame, a front cover glass, a lens, a filter and an imaging electronics unit, for photographing the suspected landmark region to obtain the suspected landmark image;
the landmark locator is connected to the aerial camera, the storage device, the GPS locator and the height sensing device respectively, and comprises an image pre-processing sub-device, a suspected landmark segmentation sub-device, a landmark recognition sub-device and a relative position detection sub-device; the image pre-processing sub-device successively performs contrast enhancement, median filtering and RGB color-space conversion on the suspected landmark image to obtain a suspected landmark RGB image; the suspected landmark segmentation sub-device, connected to the image pre-processing sub-device and the storage device respectively, calculates the R channel value, G channel value and B channel value of each pixel in the suspected landmark RGB image; when the R channel value of a pixel lies in the landmark R channel range, its G channel value in the landmark G channel range and its B channel value in the landmark B channel range, the pixel is determined to be a suspected landmark pixel, and all suspected landmark pixels in the suspected landmark RGB image are combined to form a suspected landmark sub-pattern; the landmark recognition sub-device, connected to the suspected landmark segmentation sub-device and the storage device respectively, calculates the affine invariant moment features of the suspected landmark sub-pattern and matches the suspected landmark sub-pattern one by one against the reference image templates of the landmarks of various kinds; if matching fails, it outputs a no-landmark signal; if matching succeeds, it obtains the matched landmark type and compares the affine invariant moment features stored in the storage device for that landmark type with the affine invariant moment features of the suspected landmark sub-pattern; if they differ, it outputs a no-landmark signal; if they are identical, it outputs a landmark-present signal, the matched landmark type and the relative image position of the suspected landmark sub-pattern within the suspected landmark RGB image; the relative position detection sub-device is connected to the landmark recognition sub-device, the GPS locator and the height sensing device respectively; upon receiving the landmark-present signal, it calculates the relative positioning distance of the UAV from the landmark based on the relative image position and the real-time positioning data, and takes the real-time height as the relative height of the UAV from the landmark;
the master controller is connected to the landmark locator, the height sensing device and the UAV drive device respectively; upon receiving the height-detection failure signal or the no-landmark signal, it forwards that signal to the UAV control platform via the wireless transceiver device; upon receiving the relative positioning distance and the relative height, it controls the UAV drive device, based on the relative positioning distance and the relative height, to drive the UAV to land on the landmark.
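The height logic of claim 2 can be sketched as follows (an illustrative reading only: the claim names the two weights and the range check but not the exact fusion formula, so the normalized weighted average and the echo-time formula below are assumptions):

```python
SPEED_OF_SOUND = 343.0  # m/s near the ground, assumed constant here

def ultrasonic_height(t_transmit, t_receive, v=SPEED_OF_SOUND):
    """Height from the round-trip time of an ultrasonic pulse:
    the echo travels down and back, so divide the path by two."""
    return v * (t_receive - t_transmit) / 2.0

def fused_height(baro_h, ultra_h, w_baro, w_ultra, max_diff):
    """Fuse barometric and ultrasonic readings with preset weights.

    Returns None (standing in for the height-detection failure signal)
    when the two sensors disagree by more than the preset height range.
    """
    if abs(baro_h - ultra_h) > max_diff:
        return None
    return (w_baro * baro_h + w_ultra * ultra_h) / (w_baro + w_ultra)
```

With weights 0.4/0.6 and a 1 m tolerance, readings of 10.2 m and 10.0 m fuse to 10.08 m, while readings of 15 m and 10 m trip the failure path.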
3. The UAV automatic landing system as claimed in claim 2, characterized in that:
when the terrain type beneath the aircraft is plain, in the landmark segmentation color data corresponding to that terrain type, the landmark R channel range is 150 to 255, the landmark G channel range is 0 to 120, and the landmark B channel range is 1 to 150.
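With the concrete ranges of claim 3, the per-pixel segmentation test reduces to three range checks (a minimal sketch; the mask representation is an illustration, not part of the claim):

```python
# Channel ranges for plain terrain, as given in claim 3
R_RANGE, G_RANGE, B_RANGE = (150, 255), (0, 120), (1, 150)

def is_landmark_pixel(r, g, b):
    """True when all three channels fall inside the landmark ranges."""
    return (R_RANGE[0] <= r <= R_RANGE[1]
            and G_RANGE[0] <= g <= G_RANGE[1]
            and B_RANGE[0] <= b <= B_RANGE[1])

def landmark_mask(rgb_image):
    """Binary mask of suspected landmark pixels in an RGB image,
    given as a list of rows of (r, g, b) tuples."""
    return [[1 if is_landmark_pixel(*px) else 0 for px in row]
            for row in rgb_image]
```

The set bits of the mask are exactly the "suspected landmark pixels" that the segmentation sub-device combines into the suspected landmark sub-pattern.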
4. The UAV automatic landing system as claimed in claim 2, characterized in that:
the aerial camera is an ultra-high-definition aerial camera, and the resolution of the captured suspected landmark image is 3840 × 2160.
5. The UAV automatic landing system as claimed in claim 2, characterized in that:
the image pre-processing sub-device, the suspected landmark segmentation sub-device, the landmark recognition sub-device and the relative position detection sub-device are each implemented on a separate FPGA chip.
6. The UAV automatic landing system as claimed in claim 2, characterized in that:
the image pre-processing sub-device, the suspected landmark segmentation sub-device, the landmark recognition sub-device and the relative position detection sub-device are integrated on a single printed circuit board.
7. The UAV automatic landing system as claimed in claim 2, characterized in that:
the UAV is a rotary-wing UAV.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610263384.5A CN105955289A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
CN201510574773.5A CN105068553A (en) | 2015-03-10 | 2015-03-10 | Unmanned aerial vehicle automatic landing system |
CN201510104994.6A CN104679013A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
CN201510574657.3A CN105182995B (en) | 2015-03-10 | 2015-03-10 | Autonomous Landing of UAV system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510104994.6A CN104679013A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
Related Child Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610263384.5A Division CN105955289A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
CN201610263903.8A Division CN105676875A (en) | 2015-03-10 | 2015-03-10 | Automatic landing system of unmanned aerial vehicle |
CN201510574657.3A Division CN105182995B (en) | 2015-03-10 | 2015-03-10 | Autonomous Landing of UAV system |
CN201510574773.5A Division CN105068553A (en) | 2015-03-10 | 2015-03-10 | Unmanned aerial vehicle automatic landing system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN104679013A true CN104679013A (en) | 2015-06-03 |
Family
ID=53314240
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510574773.5A Pending CN105068553A (en) | 2015-03-10 | 2015-03-10 | Unmanned aerial vehicle automatic landing system |
CN201510104994.6A Pending CN104679013A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
CN201510574657.3A Expired - Fee Related CN105182995B (en) | 2015-03-10 | 2015-03-10 | Autonomous Landing of UAV system |
CN201610263384.5A Withdrawn CN105955289A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510574773.5A Pending CN105068553A (en) | 2015-03-10 | 2015-03-10 | Unmanned aerial vehicle automatic landing system |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510574657.3A Expired - Fee Related CN105182995B (en) | 2015-03-10 | 2015-03-10 | Autonomous Landing of UAV system |
CN201610263384.5A Withdrawn CN105955289A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
Country Status (1)
Country | Link |
---|---|
CN (4) | CN105068553A (en) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105416515A (en) * | 2015-12-07 | 2016-03-23 | 武汉理工大学 | Carrying device for quad-rotor unmanned aerial vehicle |
CN105527973A (en) * | 2016-01-15 | 2016-04-27 | 无锡觅睿恪科技有限公司 | Unmanned aerial vehicle automatic landing system |
CN105676875A (en) * | 2015-03-10 | 2016-06-15 | 张超 | Automatic landing system of unmanned aerial vehicle |
CN106153008A (en) * | 2016-06-17 | 2016-11-23 | 北京理工大学 | A kind of rotor wing unmanned aerial vehicle objective localization method of view-based access control model |
WO2017080028A1 (en) * | 2015-11-14 | 2017-05-18 | 深圳市易特科信息技术有限公司 | Unmanned aerial vehicle system for positioning source of nuclear radiation |
WO2017115120A1 (en) * | 2015-12-29 | 2017-07-06 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
CN107478244A (en) * | 2017-06-23 | 2017-12-15 | 中国民航大学 | The unmanned plane check system and method for a kind of instrument-landing-system |
CN107924196A (en) * | 2015-07-16 | 2018-04-17 | 赛峰电子与防务公司 | The method landed for automatic auxiliary aviation device |
CN108594848A (en) * | 2018-03-29 | 2018-09-28 | 上海交通大学 | A kind of unmanned plane of view-based access control model information fusion autonomous ship method stage by stage |
CN108983811A (en) * | 2018-07-23 | 2018-12-11 | 南京森林警察学院 | A kind of dual-purpose unmanned plane device based on FPGA |
US10220954B2 (en) | 2015-01-04 | 2019-03-05 | Zero Zero Robotics Inc | Aerial system thermal control system and method |
US10222800B2 (en) | 2015-01-04 | 2019-03-05 | Hangzhou Zero Zero Technology Co., Ltd | System and method for automated aerial system operation |
CN109791414A (en) * | 2016-09-26 | 2019-05-21 | 深圳市大疆创新科技有限公司 | The method and system that view-based access control model lands |
US10358214B2 (en) | 2015-01-04 | 2019-07-23 | Hangzhou Zero Zro Technology Co., Ltd. | Aerial vehicle and method of operation |
US10435144B2 (en) | 2016-04-24 | 2019-10-08 | Hangzhou Zero Zero Technology Co., Ltd. | Aerial system propulsion assembly and method of use |
CN110322462A (en) * | 2019-06-13 | 2019-10-11 | 暨南大学 | Unmanned aerial vehicle vision based on 5G network feels land method and system |
CN110383196A (en) * | 2018-07-02 | 2019-10-25 | 深圳市大疆创新科技有限公司 | Unmanned plane makes a return voyage the method, apparatus and unmanned plane of control |
CN110989687A (en) * | 2019-11-08 | 2020-04-10 | 上海交通大学 | Unmanned aerial vehicle landing method based on nested square visual information |
US10719080B2 (en) | 2015-01-04 | 2020-07-21 | Hangzhou Zero Zero Technology Co., Ltd. | Aerial system and detachable housing |
US10824167B2 (en) | 2015-01-04 | 2020-11-03 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3387506B1 (en) | 2015-12-09 | 2021-01-13 | SZ DJI Technology Co., Ltd. | Systems and methods for auto-return |
CN113342038A (en) * | 2016-02-29 | 2021-09-03 | 星克跃尔株式会社 | Method and system for generating a map for flight of an unmanned aerial vehicle |
US9454154B1 (en) * | 2016-03-07 | 2016-09-27 | Amazon Technologies, Inc. | Incident light sensor on autonomous vehicle |
CN105629996A (en) * | 2016-03-22 | 2016-06-01 | 昆明天龙经纬电子科技有限公司 | Unmanned aerial vehicle fixed-point landing guiding method and system |
JP2017185758A (en) * | 2016-04-08 | 2017-10-12 | 東芝テック株式会社 | Printing device and printing method |
CN109562844B (en) * | 2016-08-06 | 2022-03-01 | 深圳市大疆创新科技有限公司 | Automated landing surface topography assessment and related systems and methods |
JP2018069961A (en) * | 2016-10-31 | 2018-05-10 | 株式会社エンルートM’s | Device arrangement device, device arrangement method, and device arrangement program |
CN107399440A (en) * | 2017-07-27 | 2017-11-28 | 北京航空航天大学 | Aircraft lands method and servicing unit |
US11829162B2 (en) | 2019-08-15 | 2023-11-28 | Teledyne Flir Detection, Inc. | Unmanned aerial vehicle locking landing pad |
US11767110B2 (en) | 2019-12-16 | 2023-09-26 | FLIR Unmanned Aerial Systems AS | System for storing, autonomously launching and landing unmanned aerial vehicles |
CN113504794B (en) * | 2021-07-23 | 2023-04-21 | 中国科学院地理科学与资源研究所 | Unmanned aerial vehicle cluster reconstruction method, unmanned aerial vehicle cluster reconstruction system and electronic equipment |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101000243A (en) * | 2007-01-16 | 2007-07-18 | 北京航空航天大学 | Pilotless plane landing navigation method and its device |
US20090306840A1 (en) * | 2008-04-08 | 2009-12-10 | Blenkhorn Kevin P | Vision-based automated landing system for unmanned aerial vehicles |
CN102156480A (en) * | 2010-12-30 | 2011-08-17 | 清华大学 | Unmanned helicopter independent landing method based on natural landmark and vision navigation |
CN102538782A (en) * | 2012-01-04 | 2012-07-04 | 浙江大学 | Helicopter landing guide device and method based on computer vision |
CN102854882A (en) * | 2012-09-21 | 2013-01-02 | 苏州工业园区职业技术学院 | Automatic control system of three-wing two-paddle recombination type unmanned aerial vehicle (UAV) |
CN103226356A (en) * | 2013-02-27 | 2013-07-31 | 广东工业大学 | Image-processing-based unmanned plane accurate position landing method |
CN103809598A (en) * | 2014-03-12 | 2014-05-21 | 北京航空航天大学 | Rotor unmanned aircraft independent take-off and landing system based on three-layer triangle multi-color landing ground |
CN203806131U (en) * | 2013-01-28 | 2014-09-03 | 上海科斗电子科技有限公司 | Gasbag flight device height adjusting device |
CN104166854A (en) * | 2014-08-03 | 2014-11-26 | 浙江大学 | Vision grading landmark locating and identifying method for autonomous landing of small unmanned aerial vehicle |
CN204406209U (en) * | 2015-03-10 | 2015-06-17 | 无锡桑尼安科技有限公司 | Autonomous Landing of UAV system |
-
2015
- 2015-03-10 CN CN201510574773.5A patent/CN105068553A/en active Pending
- 2015-03-10 CN CN201510104994.6A patent/CN104679013A/en active Pending
- 2015-03-10 CN CN201510574657.3A patent/CN105182995B/en not_active Expired - Fee Related
- 2015-03-10 CN CN201610263384.5A patent/CN105955289A/en not_active Withdrawn
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101000243A (en) * | 2007-01-16 | 2007-07-18 | 北京航空航天大学 | Pilotless plane landing navigation method and its device |
US20090306840A1 (en) * | 2008-04-08 | 2009-12-10 | Blenkhorn Kevin P | Vision-based automated landing system for unmanned aerial vehicles |
CN102156480A (en) * | 2010-12-30 | 2011-08-17 | 清华大学 | Unmanned helicopter independent landing method based on natural landmark and vision navigation |
CN102538782A (en) * | 2012-01-04 | 2012-07-04 | 浙江大学 | Helicopter landing guide device and method based on computer vision |
CN102854882A (en) * | 2012-09-21 | 2013-01-02 | 苏州工业园区职业技术学院 | Automatic control system of three-wing two-paddle recombination type unmanned aerial vehicle (UAV) |
CN203806131U (en) * | 2013-01-28 | 2014-09-03 | 上海科斗电子科技有限公司 | Gasbag flight device height adjusting device |
CN103226356A (en) * | 2013-02-27 | 2013-07-31 | 广东工业大学 | Image-processing-based unmanned plane accurate position landing method |
CN103809598A (en) * | 2014-03-12 | 2014-05-21 | 北京航空航天大学 | Rotor unmanned aircraft independent take-off and landing system based on three-layer triangle multi-color landing ground |
CN104166854A (en) * | 2014-08-03 | 2014-11-26 | 浙江大学 | Vision grading landmark locating and identifying method for autonomous landing of small unmanned aerial vehicle |
CN204406209U (en) * | 2015-03-10 | 2015-06-17 | 无锡桑尼安科技有限公司 | Autonomous Landing of UAV system |
Non-Patent Citations (1)
Title |
---|
陈盛福 (Chen Shengfu): "Research on Vision-Based Automatic Landing Technology for UAVs", China Master's Theses Full-text Database, Engineering Science and Technology II *
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10358214B2 (en) | 2015-01-04 | 2019-07-23 | Hangzhou Zero Zro Technology Co., Ltd. | Aerial vehicle and method of operation |
US10220954B2 (en) | 2015-01-04 | 2019-03-05 | Zero Zero Robotics Inc | Aerial system thermal control system and method |
US10824149B2 (en) | 2015-01-04 | 2020-11-03 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
US10824167B2 (en) | 2015-01-04 | 2020-11-03 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
US10719080B2 (en) | 2015-01-04 | 2020-07-21 | Hangzhou Zero Zero Technology Co., Ltd. | Aerial system and detachable housing |
US10528049B2 (en) | 2015-01-04 | 2020-01-07 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
US10222800B2 (en) | 2015-01-04 | 2019-03-05 | Hangzhou Zero Zero Technology Co., Ltd | System and method for automated aerial system operation |
CN105676875A (en) * | 2015-03-10 | 2016-06-15 | 张超 | Automatic landing system of unmanned aerial vehicle |
CN107924196A (en) * | 2015-07-16 | 2018-04-17 | 赛峰电子与防务公司 | The method landed for automatic auxiliary aviation device |
CN107924196B (en) * | 2015-07-16 | 2021-03-09 | 赛峰电子与防务公司 | Method for automatically assisting an aircraft landing |
WO2017080028A1 (en) * | 2015-11-14 | 2017-05-18 | 深圳市易特科信息技术有限公司 | Unmanned aerial vehicle system for positioning source of nuclear radiation |
CN105416515A (en) * | 2015-12-07 | 2016-03-23 | 武汉理工大学 | Carrying device for quad-rotor unmanned aerial vehicle |
WO2017115120A1 (en) * | 2015-12-29 | 2017-07-06 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
EP3398021A4 (en) * | 2015-12-29 | 2019-09-04 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
CN105527973A (en) * | 2016-01-15 | 2016-04-27 | 无锡觅睿恪科技有限公司 | Unmanned aerial vehicle automatic landing system |
JP2019505902A (en) * | 2016-04-22 | 2019-02-28 | ハンチョウ ゼロ ゼロ テクノロジー カンパニー リミテッドHangzhou Zero Zero Technology Co.,Ltd. | System and method for operating an automated aircraft system |
US10435144B2 (en) | 2016-04-24 | 2019-10-08 | Hangzhou Zero Zero Technology Co., Ltd. | Aerial system propulsion assembly and method of use |
US11027833B2 (en) | 2016-04-24 | 2021-06-08 | Hangzhou Zero Zero Technology Co., Ltd. | Aerial system propulsion assembly and method of use |
CN106153008B (en) * | 2016-06-17 | 2018-04-06 | 北京理工大学 | A kind of rotor wing unmanned aerial vehicle objective localization method of view-based access control model |
CN106153008A (en) * | 2016-06-17 | 2016-11-23 | 北京理工大学 | A kind of rotor wing unmanned aerial vehicle objective localization method of view-based access control model |
CN109791414A (en) * | 2016-09-26 | 2019-05-21 | 深圳市大疆创新科技有限公司 | The method and system that view-based access control model lands |
US11604479B2 (en) | 2016-09-26 | 2023-03-14 | SZ DJI Technology Co., Ltd. | Methods and system for vision-based landing |
CN107478244A (en) * | 2017-06-23 | 2017-12-15 | 中国民航大学 | The unmanned plane check system and method for a kind of instrument-landing-system |
CN108594848B (en) * | 2018-03-29 | 2021-01-22 | 上海交通大学 | Unmanned aerial vehicle staged autonomous landing method based on visual information fusion |
CN108594848A (en) * | 2018-03-29 | 2018-09-28 | 上海交通大学 | A kind of unmanned plane of view-based access control model information fusion autonomous ship method stage by stage |
CN110383196A (en) * | 2018-07-02 | 2019-10-25 | 深圳市大疆创新科技有限公司 | Unmanned plane makes a return voyage the method, apparatus and unmanned plane of control |
US11783716B2 (en) | 2018-07-02 | 2023-10-10 | SZ DJI Technology Co., Ltd. | Return flight control method and device for unmanned aerial vehicle, and unmanned aerial vehicle |
CN108983811A (en) * | 2018-07-23 | 2018-12-11 | 南京森林警察学院 | A kind of dual-purpose unmanned plane device based on FPGA |
CN110322462A (en) * | 2019-06-13 | 2019-10-11 | 暨南大学 | Unmanned aerial vehicle vision based on 5G network feels land method and system |
CN110989687A (en) * | 2019-11-08 | 2020-04-10 | 上海交通大学 | Unmanned aerial vehicle landing method based on nested square visual information |
CN110989687B (en) * | 2019-11-08 | 2021-08-10 | 上海交通大学 | Unmanned aerial vehicle landing method based on nested square visual information |
Also Published As
Publication number | Publication date |
---|---|
CN105182995B (en) | 2016-09-07 |
CN105955289A (en) | 2016-09-21 |
CN105182995A (en) | 2015-12-23 |
CN105068553A (en) | 2015-11-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN204406209U (en) | Autonomous Landing of UAV system | |
CN104679013A (en) | Unmanned plane automatic landing system | |
Al-Kaff et al. | Survey of computer vision algorithms and applications for unmanned aerial vehicles | |
CN105865454B (en) | A kind of Navigation of Pilotless Aircraft method generated based on real-time online map | |
Colomina et al. | Towards a new paradigm for high-resolution low-cost photogrammetryand remote sensing | |
Qi et al. | Search and rescue rotary‐wing uav and its application to the lushan ms 7.0 earthquake | |
Samad et al. | The potential of Unmanned Aerial Vehicle (UAV) for civilian and mapping application | |
CN103822635B (en) | The unmanned plane during flying spatial location real-time computing technique of view-based access control model information | |
CN102435188B (en) | Monocular vision/inertia autonomous navigation method for indoor environment | |
CN102426019B (en) | Unmanned aerial vehicle scene matching auxiliary navigation method and system | |
Quist et al. | Radar odometry on fixed-wing small unmanned aircraft | |
US20150051758A1 (en) | Method and System for Landing of Unmanned Aerial Vehicle | |
CN103809598A (en) | Rotor unmanned aircraft independent take-off and landing system based on three-layer triangle multi-color landing ground | |
CN103954283A (en) | Scene matching/visual odometry-based inertial integrated navigation method | |
CN203376646U (en) | Low-altitude remote sensing monitoring system based on combination of 3S technology and unmanned aerial vehicle | |
CN102381481A (en) | Unmanned aerial vehicle-mounted laser radar system | |
Haarbrink et al. | Helicopter UAV for photogrammetry and rapid response | |
CN112596071A (en) | Unmanned aerial vehicle autonomous positioning method and device and unmanned aerial vehicle | |
Wilson et al. | Vision‐aided Guidance and Navigation for Close Formation Flight | |
CN112379681A (en) | Unmanned aerial vehicle obstacle avoidance flight method and device and unmanned aerial vehicle | |
Papa | Embedded platforms for UAS landing path and obstacle detection | |
Kong et al. | A ground-based multi-sensor system for autonomous landing of a fixed wing UAV | |
Williams et al. | All-source navigation for enhancing UAV operations in GPS-denied environments | |
Zhang et al. | Infrared-based autonomous navigation for civil aircraft precision approach and landing | |
CN105676875A (en) | Automatic landing system of unmanned aerial vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
WD01 | Invention patent application deemed withdrawn after publication |
Application publication date: 20150603 |