CN105955289A - Unmanned plane automatic landing system - Google Patents
Unmanned plane automatic landing system
- Publication number
- CN105955289A (Application No. CN201610263384.5A)
- Authority
- CN
- China
- Prior art keywords
- landmark
- suspected
- image
- unmanned aerial vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Traffic Control Systems (AREA)
- Position Fixing By Use Of Radio Waves (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention relates to an automatic landing system for an unmanned aerial vehicle (UAV), comprising an aerial camera, a landmark locator, a UAV drive device, and a main controller. The aerial camera photographs the suspected landmark area beneath the UAV to acquire a suspected landmark image. The landmark locator performs image processing on the suspected landmark image and, when a landmark is present in it, obtains the relative height of the UAV above the landmark and the relative positioning distance between the UAV and the landmark. The main controller is connected to the landmark locator and to the UAV drive device; under its control, the drive device lands the UAV on the landmark according to the relative height and the relative positioning distance. The system achieves automatic, accurate landing of the UAV under different terrain conditions and improves the intelligence level of the UAV.
Description
The present application is a divisional application of the patent application No. 201510104994.6, filed on March 10, 2015 and entitled "Unmanned aerial vehicle automatic landing system".
Technical field
The present invention relates to the field of unmanned aerial vehicle (UAV) navigation, and in particular to a UAV automatic landing system.
Background art
With the maturing of UAV technology and the continuing expansion of aerial photography, UAVs are widely used in the military field for combat support such as reconnaissance and surveillance. More importantly, civilian UAV applications are becoming increasingly widespread, including photogrammetry, emergency and disaster relief, public safety, resource exploration, environmental monitoring, natural disaster monitoring and assessment, urban planning and municipal administration, and the protection and monitoring of forests against fire, pests and disease.
Because UAVs play such important roles in many fields, the requirements on the intelligence of UAV flight keep rising. In general, take-off is relatively easy to control, whereas landing and recovery are difficult. Yet recovering the UAV by autonomous landing after it has completed its assigned task is of great significance, so improving the accuracy of autonomous landing can greatly raise the intelligence level of the UAV.
In the prior art, autonomous UAV landing relies on an inertial navigation system (INS), a GPS navigation system, or an INS/GPS integrated navigation system. However, INS equipment is expensive, the accuracy of a stand-alone GPS navigation system is not high, and none of INS, GPS or INS/GPS integrated navigation can achieve fully intelligent landing. In addition, existing UAV landing control is limited to finding, locating and landing on landmarks in specific terrain, and cannot achieve autonomous UAV landing over all types of terrain.
Therefore, a new UAV autonomous landing scheme is needed: one that strikes a good balance between cost and accuracy, offers good cost performance, and at the same time can reliably find the landing landmark under various terrains and complete the landing of the UAV fully automatically.
Summary of the invention
In order to solve the above problems, the invention provides a UAV automatic landing system. The landing landmark is located accurately by combining landmark template matching with matching of the affine invariant moment features of the landmark pattern; the relative position of the UAV with respect to the landing landmark is obtained through navigation, height measurement and image processing techniques; and the combination of terrain recognition and wireless communication enables the search for landing landmarks under different terrains. The whole landing system operates fully automatically, without external intervention, and offers a suitable cost-performance ratio.
According to an aspect of the present invention, there is provided a UAV automatic landing system. The system includes an aerial camera, a landmark locator, a UAV drive device and a main controller. The aerial camera photographs the suspected landmark area beneath the UAV to obtain a suspected landmark image. The landmark locator performs image processing on the suspected landmark image and, when it determines that a landmark is present in the image, obtains the relative height of the UAV above the landmark and the relative positioning distance between the UAV and the landmark. The main controller is connected to the landmark locator and to the UAV drive device, and controls the drive device, based on the relative height and the relative positioning distance, to land the UAV on the landmark.
More specifically, the UAV automatic landing system further includes the following.

A terrain estimator is connected to the aerial camera and determines the type of terrain beneath the UAV based on the suspected landmark image; the terrain types include plain, grassland, mountain area, desert, naval vessel, gobi, city and hills.

A wireless transceiver establishes a bidirectional wireless communication link with a remote UAV control platform and is connected to the terrain estimator and the storage device. It sends the terrain type beneath the UAV to the control platform, receives from the platform the landmark segmentation color data corresponding to that terrain type, and stores this data in the storage device. The landmark segmentation color data comprises a landmark R channel range, a landmark G channel range and a landmark B channel range, which are used to separate the landmark from the background in an RGB image.

A GPS locator is connected to the GPS navigation satellites and receives real-time positioning data of the UAV's current position.

A storage device pre-stores a preset height range, a preset barometric altitude weight and a preset ultrasonic height weight, and also pre-stores the reference image templates of landmarks of various kinds and the affine invariant moment features of those landmarks; the reference image template of each kind of landmark is a pattern obtained by photographing the reference landmark of that kind in advance, and the affine invariant moment features of each kind of landmark are extracted from its reference image template.

A height sensing device is connected to the storage device and includes a barometric altitude sensor, an ultrasonic height sensor and a microcontroller. The barometric altitude sensor detects the real-time barometric altitude of the UAV's position from changes in air pressure near the UAV. The ultrasonic height sensor includes an ultrasonic transmitter, an ultrasonic receiver and a single-chip microcomputer; the single-chip microcomputer is connected to the transmitter and the receiver, the transmitter emits ultrasonic waves toward the ground, the receiver receives the waves reflected from the ground, and the single-chip microcomputer calculates the real-time ultrasonic height of the UAV from the transmission time, the reception time and the ultrasonic propagation velocity. The microcontroller is connected to the barometric altitude sensor, the ultrasonic height sensor and the storage device; when the difference between the real-time barometric altitude and the real-time ultrasonic height lies within the preset height range, it calculates and outputs the real-time height based on the preset barometric altitude weight, the preset ultrasonic height weight, the real-time barometric altitude and the real-time ultrasonic height; when the difference lies outside the preset height range, it outputs a height detection failure signal.

The aerial camera is a linear-array digital aerial camera, including a shock-absorbing landing gear, a front cover glass, a lens, a filter and an imaging electronics unit, and photographs the suspected landmark area to obtain the suspected landmark image.

The landmark locator is connected to the aerial camera, the storage device, the GPS locator and the height sensing device, and includes an image preprocessing sub-device, a suspected landmark segmentation sub-device, a landmark identification sub-device and a relative position detection sub-device. The image preprocessing sub-device successively applies contrast enhancement, median filtering and RGB color space conversion to the suspected landmark image to obtain a suspected landmark RGB image. The suspected landmark segmentation sub-device is connected to the image preprocessing sub-device and the storage device; it calculates the R, G and B channel values of each pixel in the suspected landmark RGB image, determines a pixel to be a suspected landmark pixel when its R channel value lies within the landmark R channel range, its G channel value within the landmark G channel range and its B channel value within the landmark B channel range, and combines all suspected landmark pixels in the image to form a suspected landmark sub-pattern. The landmark identification sub-device is connected to the suspected landmark segmentation sub-device and the storage device; it calculates the affine invariant moment features of the suspected landmark sub-pattern and matches the sub-pattern one by one against the reference image templates of the various kinds of landmarks. If no template matches, it outputs a no-landmark signal. If a template matches, it obtains the matched landmark type and compares the affine invariant moment features stored for that landmark type with the affine invariant moment features of the suspected landmark sub-pattern; if they differ, it outputs a no-landmark signal, and if they are identical, it outputs a landmark-present signal, the matched landmark type and the relative image position of the suspected landmark sub-pattern within the suspected landmark RGB image. The relative position detection sub-device is connected to the landmark identification sub-device, the GPS locator and the height sensing device; on receiving the landmark-present signal, it calculates the relative positioning distance between the UAV and the landmark from the relative image position and the real-time positioning data, and takes the real-time height as the relative height of the UAV above the landmark.

The main controller is connected to the landmark locator, the height sensing device and the UAV drive device. On receiving a height detection failure signal or a no-landmark signal, it forwards that signal to the UAV control platform through the wireless transceiver. On receiving the relative positioning distance and the relative height, it controls the UAV drive device, based on these values, to land the UAV on the landmark.
More specifically, in the UAV automatic landing system: when the terrain type beneath the UAV is plain, in the landmark segmentation color data corresponding to that terrain type, the landmark R channel range is 150 to 255, the landmark G channel range is 0 to 120, and the landmark B channel range is 1 to 150.
More specifically, in the UAV automatic landing system: the aerial camera is an ultra-high-definition aerial camera, and the resolution of the captured suspected landmark image is 3840 × 2160.
More specifically, in the UAV automatic landing system: the image preprocessing sub-device, the suspected landmark segmentation sub-device, the landmark identification sub-device and the relative position detection sub-device are each implemented with a different FPGA chip.
More specifically, in the UAV automatic landing system: the image preprocessing sub-device, the suspected landmark segmentation sub-device, the landmark identification sub-device and the relative position detection sub-device are integrated on one printed circuit board.
More specifically, in the UAV automatic landing system: the UAV is a rotary-wing UAV.
Brief description of the drawings
Embodiments of the present invention are described below with reference to the accompanying drawings, in which:
Fig. 1 is a structural block diagram of the UAV automatic landing system according to an embodiment of the present invention.
Fig. 2 is a block diagram of the landmark locator of the UAV automatic landing system according to an embodiment of the present invention.
Detailed description of the invention
The embodiments of the UAV automatic landing system of the present invention are described in detail below with reference to the accompanying drawings.
An unmanned aerial vehicle, abbreviated "UAV", is an unmanned aircraft operated by radio remote control equipment and an on-board program control device. From a technical standpoint, UAVs can be divided into several major classes: unmanned helicopters, unmanned fixed-wing aircraft, unmanned multi-rotor aircraft, unmanned airships and unmanned parawing aircraft. By purpose, they can be divided into military and civilian UAVs. In the military field, they are used for battlefield reconnaissance and surveillance, positioning and fire adjustment, damage assessment and electronic warfare; in the civilian field, for border patrol, nuclear radiation detection, aerial photography, aerial mineral exploration, disaster monitoring, traffic patrol and security monitoring.
Military applications of UAVs need no elaboration, and civilian applications are now becoming increasingly widespread. Because of the particularities of UAV flight control, take-off is usually easy to control, whereas landing and recovery require flying to a specific landing point, such as the position of a landing landmark; the flight control requirements are therefore higher and the control is more difficult. How to reduce manual operation during UAV landing and achieve high-precision, fully automatic landing under various terrains is one of the urgent problems every UAV company needs to solve.
The UAV landing methods of the prior art cannot achieve an effective balance between high accuracy and low cost, nor can they achieve autonomous UAV landing under all terrains, so their intelligence level is not high. To this end, the present invention builds a UAV automatic landing system that uses targeted image processing equipment to improve the accuracy of locating the landing site, while the combined use of terrain recognition technology and wireless communication technology satisfies the landing requirements under different terrains.
Fig. 1 is a structural block diagram of the UAV automatic landing system according to an embodiment of the present invention. The system includes a UAV drive device 1, a main controller 2, an aerial camera 3 and a landmark locator 4. The aerial camera 3 is connected to the landmark locator 4, and the main controller 2 is connected to the UAV drive device 1, the aerial camera 3 and the landmark locator 4, respectively.
The aerial camera 3 photographs the suspected landmark area beneath the UAV to obtain a suspected landmark image. The landmark locator 4 performs image processing on the suspected landmark image and, when a landmark is present in the image, obtains the relative height of the UAV above the landmark and the relative positioning distance. The main controller 2 controls the UAV drive device 1, based on the relative height and the relative positioning distance, to land the UAV on the landmark.
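As a minimal illustration of this control relationship only (the patent does not specify the control law beyond its use of relative height and relative positioning distance), a simple proportional descent loop might look as follows; the gains, threshold and drive-device interface are hypothetical.

```python
# Hypothetical sketch of the main controller's landing step: steer toward the
# landmark and descend, using only relative height and relative positioning
# distance as inputs. Gains and the 2 m "hover until roughly overhead"
# threshold are assumptions, not taken from the patent.

def landing_step(rel_height_m, rel_distance_m, k_lateral=0.5, k_descent=0.3):
    """Return (horizontal_speed, descent_speed) commands for the drive device."""
    horizontal_speed = k_lateral * rel_distance_m   # close the ground offset
    descent_speed = k_descent * rel_height_m        # descend faster when higher
    if rel_distance_m > 2.0:                        # stay level until near the landmark
        descent_speed = 0.0
    return horizontal_speed, descent_speed

# Example: 12 m above the landmark, 1.5 m of horizontal offset remaining.
print(landing_step(12.0, 1.5))   # -> (0.75, 3.6)
```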
The specific structure of the UAV automatic landing system of the present invention is described further below.
The system further includes a terrain estimator, connected to the aerial camera 3, for determining the type of terrain beneath the UAV based on the suspected landmark image; the terrain types include plain, grassland, mountain area, desert, naval vessel, gobi, city and hills.
The system further includes a wireless transceiver, which establishes a bidirectional wireless communication link with a remote UAV control platform and is connected to the terrain estimator and the storage device. It sends the terrain type beneath the UAV to the control platform, receives from the platform the landmark segmentation color data corresponding to that terrain type, and stores this data in the storage device. The landmark segmentation color data includes a landmark R channel range, a landmark G channel range and a landmark B channel range, which are used to separate the landmark from the background in an RGB image.
The system further includes a GPS locator, connected to the GPS navigation satellites, for receiving real-time positioning data of the UAV's current position.
The system further includes a storage device for pre-storing a preset height range, a preset barometric altitude weight and a preset ultrasonic height weight, as well as the reference image templates of landmarks of various kinds and their affine invariant moment features; the reference image template of each kind of landmark is a pattern obtained by photographing the reference landmark of that kind in advance, and the affine invariant moment features of each kind of landmark are extracted from its reference image template.
The system further includes a height sensing device, connected to the storage device, which includes a barometric altitude sensor, an ultrasonic height sensor and a microcontroller. The barometric altitude sensor detects the real-time barometric altitude of the UAV's position from changes in air pressure near the UAV. The ultrasonic height sensor includes an ultrasonic transmitter, an ultrasonic receiver and a single-chip microcomputer; the single-chip microcomputer is connected to the transmitter and the receiver, the transmitter emits ultrasonic waves toward the ground, the receiver receives the waves reflected from the ground, and the single-chip microcomputer calculates the real-time ultrasonic height of the UAV from the transmission time, the reception time and the ultrasonic propagation velocity. The microcontroller is connected to the barometric altitude sensor, the ultrasonic height sensor and the storage device; when the difference between the real-time barometric altitude and the real-time ultrasonic height lies within the preset height range, it calculates and outputs the real-time height based on the preset barometric altitude weight, the preset ultrasonic height weight, the real-time barometric altitude and the real-time ultrasonic height; when the difference lies outside the preset height range, it outputs a height detection failure signal.
The aerial camera 3 is a linear-array digital aerial camera, including a shock-absorbing landing gear, a front cover glass, a lens, a filter and an imaging electronics unit, and photographs the suspected landmark area to obtain the suspected landmark image.
As shown in Fig. 2, the landmark locator 4 is connected to the aerial camera 3, the storage device, the GPS locator and the height sensing device, and includes an image preprocessing sub-device 41, a suspected landmark segmentation sub-device 42, a landmark identification sub-device 43 and a relative position detection sub-device 44.
The image preprocessing sub-device 41 successively applies contrast enhancement, median filtering and RGB color space conversion to the suspected landmark image to obtain a suspected landmark RGB image.
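A rough sketch of such a preprocessing chain using NumPy only. The patent does not specify the contrast enhancement method, so a simple linear stretch is assumed here; the median filtering step described later in this document would sit between the two operations shown.

```python
import numpy as np

def contrast_stretch(img_u8):
    """Linear contrast enhancement: stretch the intensity range to 0..255.
    (Assumed method; the patent only says 'contrast enhancement processing'.)"""
    img = img_u8.astype(np.float32)
    lo, hi = img.min(), img.max()
    if hi <= lo:
        return img_u8.copy()
    return ((img - lo) * 255.0 / (hi - lo)).astype(np.uint8)

def to_rgb(img_bgr_u8):
    """Color space conversion to RGB, assuming the camera delivers BGR order."""
    return img_bgr_u8[..., ::-1]

# Example on a dummy 4x4 three-channel frame.
frame = np.random.randint(60, 180, size=(4, 4, 3), dtype=np.uint8)
rgb = to_rgb(contrast_stretch(frame))
print(rgb.shape, rgb.dtype)   # (4, 4, 3) uint8
```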
The suspected landmark segmentation sub-device 42 is connected to the image preprocessing sub-device 41 and the storage device. It calculates the R, G and B channel values of each pixel in the suspected landmark RGB image; when a pixel's R channel value lies within the landmark R channel range, its G channel value within the landmark G channel range and its B channel value within the landmark B channel range, the pixel is determined to be a suspected landmark pixel. All suspected landmark pixels in the suspected landmark RGB image are combined to form the suspected landmark sub-pattern.
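A sketch of this per-pixel channel-range test, using the plain-terrain ranges quoted later in the embodiment (R 150 to 255, G 0 to 120, B 1 to 150) as example thresholds:

```python
import numpy as np

# Landmark segmentation color data for "plain" terrain, as given in the embodiment.
R_RANGE, G_RANGE, B_RANGE = (150, 255), (0, 120), (1, 150)

def suspected_landmark_mask(rgb_u8):
    """Boolean mask of suspected landmark pixels: every channel must fall
    inside its landmark channel range."""
    r, g, b = rgb_u8[..., 0], rgb_u8[..., 1], rgb_u8[..., 2]
    return ((R_RANGE[0] <= r) & (r <= R_RANGE[1]) &
            (G_RANGE[0] <= g) & (g <= G_RANGE[1]) &
            (B_RANGE[0] <= b) & (b <= B_RANGE[1]))

def suspected_landmark_subpattern(rgb_u8):
    """Keep only suspected landmark pixels; everything else is zeroed out,
    which stands in here for 'combining' them into the sub-pattern."""
    mask = suspected_landmark_mask(rgb_u8)
    out = np.zeros_like(rgb_u8)
    out[mask] = rgb_u8[mask]
    return out, mask

img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0, 0] = (200, 80, 40)    # inside all three ranges -> suspected landmark pixel
img[0, 1] = (100, 80, 40)    # R too low -> background
_, m = suspected_landmark_subpattern(img)
print(m)   # [[ True False] [False False]]
```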
The landmark identification sub-device 43 is connected to the suspected landmark segmentation sub-device 42 and the storage device. It calculates the affine invariant moment features of the suspected landmark sub-pattern and matches the sub-pattern one by one against the reference image templates of the landmarks of the various kinds. If the matching fails, it outputs a no-landmark signal. If the matching succeeds, it obtains the matched landmark type and compares the affine invariant moment features stored for that landmark type with the affine invariant moment features of the suspected landmark sub-pattern; if they differ, it outputs a no-landmark signal, and if they are identical, it outputs a landmark-present signal, the matched landmark type and the relative image position of the suspected landmark sub-pattern within the suspected landmark RGB image.
The relative position detection sub-device 44 is connected to the landmark identification sub-device 43, the GPS locator and the height sensing device. On receiving the landmark-present signal, it calculates the relative positioning distance between the UAV and the landmark from the relative image position and the real-time positioning data, and takes the real-time height as the relative height of the UAV above the landmark.
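The patent does not state how the relative image position and the GPS data are combined. One plausible reading, sketched below under clearly labeled assumptions, projects the pixel offset of the landmark from the image centre to a ground offset using the real-time height and an assumed camera field of view; the GPS fix would then anchor that offset in absolute coordinates.

```python
import math

def relative_positioning_distance(px_offset, py_offset, image_size,
                                  rel_height_m, fov_deg=(60.0, 45.0)):
    """Horizontal ground distance from the camera nadir to the landmark.

    px_offset, py_offset: landmark centre minus image centre, in pixels.
    image_size:           (width, height) in pixels.
    fov_deg:              assumed horizontal/vertical field of view of the
                          aerial camera; not given in the patent.
    """
    w, h = image_size
    # metres of ground covered per pixel at the current height (pinhole model)
    gx = 2.0 * rel_height_m * math.tan(math.radians(fov_deg[0] / 2.0)) / w
    gy = 2.0 * rel_height_m * math.tan(math.radians(fov_deg[1] / 2.0)) / h
    return math.hypot(px_offset * gx, py_offset * gy)

# Landmark 400 px right / 150 px below centre of a 3840x2160 frame at 20 m height.
print(round(relative_positioning_distance(400, 150, (3840, 2160), 20.0), 2))  # ~2.67 m
```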
The main controller 2 is connected to the landmark locator 4, the height sensing device and the UAV drive device 1. On receiving a height detection failure signal or a no-landmark signal, it forwards that signal to the UAV control platform through the wireless transceiver. On receiving the relative positioning distance and the relative height, it controls the UAV drive device 1, based on these values, to land the UAV on the landmark.
In the system: when the terrain type beneath the UAV is plain, in the corresponding landmark segmentation color data the landmark R channel range is 150 to 255, the landmark G channel range is 0 to 120, and the landmark B channel range is 1 to 150. The aerial camera 3 is an ultra-high-definition aerial camera, and the resolution of the captured suspected landmark image is 3840 × 2160. The image preprocessing sub-device 41, the suspected landmark segmentation sub-device 42, the landmark identification sub-device 43 and the relative position detection sub-device 44 are each implemented with a different FPGA chip; alternatively, they are integrated on one printed circuit board. Optionally, the UAV is a rotary-wing UAV.
In addition, the median filter is a nonlinear digital filtering technique often used to remove noise from images or other signals. Its design idea is to examine the samples in the input signal and decide whether each is representative of the signal, using an observation window composed of an odd number of samples. The values within the window are sorted, and the value in the middle of the window is taken as the output; the oldest value is then discarded, a new sample is acquired, and the calculation is repeated.
In image processing, a certain degree of noise reduction is usually required before further processing such as edge detection. Median filtering is a common step in image processing and is particularly effective against speckle noise and salt-and-pepper noise. Its edge-preserving property also makes it useful where blurred edges are undesirable.
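A minimal sketch of the sliding-window procedure just described, for a one-dimensional signal with an odd-sized window (edges handled by reflection):

```python
import numpy as np

def median_filter_1d(signal, window=3):
    """Sliding odd-sized window: sort the values in the window and output the
    middle one, exactly as described above. Edges are padded by reflection."""
    assert window % 2 == 1, "window must contain an odd number of samples"
    half = window // 2
    padded = np.pad(np.asarray(signal, dtype=float), half, mode="reflect")
    return np.array([np.median(padded[i:i + window])
                     for i in range(len(signal))])

# A step signal corrupted by one salt-and-pepper spike: the spike is removed,
# while the step edge is preserved.
noisy = [1, 1, 1, 9, 1, 1, 5, 5, 5]
print(median_filter_1d(noisy, window=3))   # [1. 1. 1. 1. 1. 1. 5. 5. 5.]
```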
In addition, the storage device may be an SDRAM, i.e. Synchronous Dynamic Random Access Memory. "Synchronous" means the memory requires a synchronous clock, on which the transmission of internal commands and data is based; "dynamic" means the storage array must be refreshed constantly to prevent data loss; "random" means data is not stored linearly in sequence, but is read from and written to freely specified addresses.
SDRAM has gone through four generations so far: first-generation SDR SDRAM, second-generation DDR SDRAM, third-generation DDR2 SDRAM and fourth-generation DDR3 SDRAM. First-generation SDRAM uses a single-ended clock signal; the second, third and fourth generations, operating at higher frequencies, use differential clock signals, which reduce interference, as the synchronous clock. The clock frequency of SDR SDRAM equals its data rate, so first-generation memory is named after its clock frequency: pc100 and pc133 indicate a clock signal, and thus a data rate, of 100 or 133 MHz. The later second-, third- and fourth-generation DDR (Double Data Rate) memories are named after their data rates, prefixed with a symbol indicating the DDR generation: PC1 = DDR, PC2 = DDR2, PC3 = DDR3. For example, PC2700 is DDR333, with an operating frequency of 333/2 = 166 MHz, where 2700 indicates a bandwidth of 2.7 GB/s. DDR data rates range from DDR200 to DDR400, DDR2 from DDR2-400 to DDR2-800, and DDR3 from DDR3-800 to DDR3-1600.
The UAV automatic landing system of the present invention addresses the technical problem that existing UAV landing technology cannot simultaneously achieve good cost performance and a high level of accuracy and intelligence: combining landmark template matching with matching of the affine invariant moment features of the landmark pattern meets the requirements of high accuracy and good cost performance; the coordinated use of terrain recognition and wireless communication technology enables the search for various landing landmarks under various terrains; and finally, based on the relative position of the found landmark with respect to the UAV, the UAV is automatically driven to the intended landing point.
It should be understood that although the present invention has been disclosed above with preferred embodiments, the above embodiments are not intended to limit the present invention. Any person of ordinary skill in the art may, without departing from the scope of the technical solution of the present invention, use the technical contents disclosed above to make many possible variations and modifications to the technical solution of the present invention, or modify it into equivalent embodiments with equivalent changes. Therefore, any simple modification, equivalent change or modification made to the above embodiments according to the technical essence of the present invention, without departing from the content of the technical solution of the present invention, still falls within the protection scope of the technical solution of the present invention.
Claims (2)
1. A UAV automatic landing system, characterized in that the system includes an aerial camera, a landmark locator, a UAV drive device and a main controller; the aerial camera photographs the suspected landmark area beneath the UAV to obtain a suspected landmark image; the landmark locator performs image processing on the suspected landmark image and, when a landmark is present in the image, obtains the relative height of the UAV above the landmark and the relative positioning distance; and the main controller is connected to the landmark locator and to the UAV drive device, and controls the UAV drive device, based on the relative height and the relative positioning distance, to land the UAV on the landmark.
2. The UAV automatic landing system as claimed in claim 1, characterized in that the system further includes:
a terrain estimator, connected to the aerial camera, for determining the type of terrain beneath the UAV based on the suspected landmark image, the terrain types including plain, grassland, mountain area, desert, naval vessel, gobi, city and hills;
a wireless transceiver, which establishes a bidirectional wireless communication link with a remote UAV control platform, is connected to the terrain estimator and the storage device, sends the terrain type beneath the UAV to the UAV control platform, receives from the platform the landmark segmentation color data corresponding to that terrain type, and stores this data in the storage device, the landmark segmentation color data including a landmark R channel range, a landmark G channel range and a landmark B channel range for separating the landmark from the background in an RGB image;
a GPS locator, connected to the GPS navigation satellites, for receiving real-time positioning data of the UAV's current position;
a storage device, for pre-storing a preset height range, a preset barometric altitude weight and a preset ultrasonic height weight, and for pre-storing the reference image templates of landmarks of various kinds and the affine invariant moment features of those landmarks, the reference image template of each kind of landmark being a pattern obtained by photographing the reference landmark of that kind in advance, and the affine invariant moment features of each kind of landmark being extracted from its reference image template;
a height sensing device, connected to the storage device, including a barometric altitude sensor, an ultrasonic height sensor and a microcontroller; the barometric altitude sensor detects the real-time barometric altitude of the UAV's position from changes in air pressure near the UAV; the ultrasonic height sensor includes an ultrasonic transmitter, an ultrasonic receiver and a single-chip microcomputer, the single-chip microcomputer being connected to the transmitter and the receiver, the transmitter emitting ultrasonic waves toward the ground, the receiver receiving the waves reflected from the ground, and the single-chip microcomputer calculating the real-time ultrasonic height of the UAV from the transmission time, the reception time and the ultrasonic propagation velocity; the microcontroller is connected to the barometric altitude sensor, the ultrasonic height sensor and the storage device, calculates and outputs the real-time height based on the preset barometric altitude weight, the preset ultrasonic height weight, the real-time barometric altitude and the real-time ultrasonic height when the difference between the real-time barometric altitude and the real-time ultrasonic height lies within the preset height range, and outputs a height detection failure signal when the difference lies outside the preset height range;
the aerial camera is a linear-array digital aerial camera, including a shock-absorbing landing gear, a front cover glass, a lens, a filter and an imaging electronics unit, and photographs the suspected landmark area to obtain the suspected landmark image;
the landmark locator is connected to the aerial camera, the storage device, the GPS locator and the height sensing device, and includes an image preprocessing sub-device, a suspected landmark segmentation sub-device, a landmark identification sub-device and a relative position detection sub-device; the image preprocessing sub-device successively applies contrast enhancement, median filtering and RGB color space conversion to the suspected landmark image to obtain a suspected landmark RGB image; the suspected landmark segmentation sub-device is connected to the image preprocessing sub-device and the storage device, calculates the R, G and B channel values of each pixel in the suspected landmark RGB image, determines a pixel to be a suspected landmark pixel when its R channel value lies within the landmark R channel range, its G channel value within the landmark G channel range and its B channel value within the landmark B channel range, and combines all suspected landmark pixels in the image to form a suspected landmark sub-pattern; the landmark identification sub-device is connected to the suspected landmark segmentation sub-device and the storage device, calculates the affine invariant moment features of the suspected landmark sub-pattern, matches the sub-pattern one by one against the reference image templates of the landmarks of the various kinds, outputs a no-landmark signal if the matching fails, and if the matching succeeds obtains the matched landmark type and compares the affine invariant moment features stored for that landmark type with the affine invariant moment features of the suspected landmark sub-pattern, outputting a no-landmark signal if they differ and, if they are identical, outputting a landmark-present signal, the matched landmark type and the relative image position of the suspected landmark sub-pattern within the suspected landmark RGB image; the relative position detection sub-device is connected to the landmark identification sub-device, the GPS locator and the height sensing device, and on receiving the landmark-present signal calculates the relative positioning distance between the UAV and the landmark from the relative image position and the real-time positioning data, and takes the real-time height as the relative height of the UAV above the landmark;
the main controller is connected to the landmark locator, the height sensing device and the UAV drive device, forwards a height detection failure signal or a no-landmark signal to the UAV control platform through the wireless transceiver when receiving such a signal, and, on receiving the relative positioning distance and the relative height, controls the UAV drive device based on them to land the UAV on the landmark;
when the terrain type beneath the UAV is plain, in the corresponding landmark segmentation color data the landmark R channel range is 150 to 255, the landmark G channel range is 0 to 120, and the landmark B channel range is 1 to 150;
the image preprocessing sub-device, the suspected landmark segmentation sub-device, the landmark identification sub-device and the relative position detection sub-device are each implemented with a different FPGA chip; and
the UAV is a rotary-wing UAV.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610263384.5A CN105955289A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610263384.5A CN105955289A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
CN201510104994.6A CN104679013A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510104994.6A Division CN104679013A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105955289A true CN105955289A (en) | 2016-09-21 |
Family
ID=53314240
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510104994.6A Pending CN104679013A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
CN201510574657.3A Expired - Fee Related CN105182995B (en) | 2015-03-10 | 2015-03-10 | Autonomous Landing of UAV system |
CN201610263384.5A Withdrawn CN105955289A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
CN201510574773.5A Pending CN105068553A (en) | 2015-03-10 | 2015-03-10 | Unmanned aerial vehicle automatic landing system |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510104994.6A Pending CN104679013A (en) | 2015-03-10 | 2015-03-10 | Unmanned plane automatic landing system |
CN201510574657.3A Expired - Fee Related CN105182995B (en) | 2015-03-10 | 2015-03-10 | Autonomous Landing of UAV system |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510574773.5A Pending CN105068553A (en) | 2015-03-10 | 2015-03-10 | Unmanned aerial vehicle automatic landing system |
Country Status (1)
Country | Link |
---|---|
CN (4) | CN104679013A (en) |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10358214B2 (en) | 2015-01-04 | 2019-07-23 | Hangzhou Zero Zro Technology Co., Ltd. | Aerial vehicle and method of operation |
US9836053B2 (en) | 2015-01-04 | 2017-12-05 | Zero Zero Robotics Inc. | System and method for automated aerial system operation |
US10220954B2 (en) | 2015-01-04 | 2019-03-05 | Zero Zero Robotics Inc | Aerial system thermal control system and method |
US10719080B2 (en) | 2015-01-04 | 2020-07-21 | Hangzhou Zero Zero Technology Co., Ltd. | Aerial system and detachable housing |
US10126745B2 (en) | 2015-01-04 | 2018-11-13 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
CN105676875A (en) * | 2015-03-10 | 2016-06-15 | 张超 | Automatic landing system of unmanned aerial vehicle |
FR3038991B1 (en) * | 2015-07-16 | 2018-08-17 | Safran Electronics & Defense | AUTOMATIC ASSISTANCE METHOD FOR LANDING AN AIRCRAFT |
CN205120973U (en) * | 2015-11-14 | 2016-03-30 | 深圳市易特科信息技术有限公司 | A unmanned aerial vehicle system for fixing a position nuclear radiation radiation source |
CN105416515B (en) * | 2015-12-07 | 2018-01-12 | 武汉理工大学 | A kind of four rotor wing unmanned aerial vehicle carrying devices |
EP3387506B1 (en) | 2015-12-09 | 2021-01-13 | SZ DJI Technology Co., Ltd. | Systems and methods for auto-return |
KR102220394B1 (en) * | 2015-12-29 | 2021-02-24 | 항저우 제로 제로 테크놀로지 컴퍼니 리미티드 | System and method for automatic aviation system operation |
CN105527973A (en) * | 2016-01-15 | 2016-04-27 | 无锡觅睿恪科技有限公司 | Unmanned aerial vehicle automatic landing system |
CN113342038B (en) * | 2016-02-29 | 2024-08-20 | 星克跃尔株式会社 | Method and system for generating map for unmanned aerial vehicle flight |
US9454154B1 (en) * | 2016-03-07 | 2016-09-27 | Amazon Technologies, Inc. | Incident light sensor on autonomous vehicle |
CN105629996A (en) * | 2016-03-22 | 2016-06-01 | 昆明天龙经纬电子科技有限公司 | Unmanned aerial vehicle fixed-point landing guiding method and system |
JP2017185758A (en) * | 2016-04-08 | 2017-10-12 | 東芝テック株式会社 | Printing device and printing method |
WO2017187275A2 (en) | 2016-04-24 | 2017-11-02 | Hangzhou Zero Zero Technology Co., Ltd. | Aerial system propulsion assembly and method of use |
CN106153008B (en) * | 2016-06-17 | 2018-04-06 | 北京理工大学 | Vision-based target localization method for a rotary-wing UAV |
CN114476105A (en) * | 2016-08-06 | 2022-05-13 | 深圳市大疆创新科技有限公司 | Automated landing surface topography assessment and related systems and methods |
WO2018053861A1 (en) | 2016-09-26 | 2018-03-29 | SZ DJI Technology Co., Ltd. | Methods and system for vision-based landing |
CN107478244A (en) * | 2017-06-23 | 2017-12-15 | 中国民航大学 | The unmanned plane check system and method for a kind of instrument-landing-system |
CN107399440A (en) * | 2017-07-27 | 2017-11-28 | 北京航空航天大学 | Aircraft lands method and servicing unit |
CN108594848B (en) * | 2018-03-29 | 2021-01-22 | 上海交通大学 | Unmanned aerial vehicle staged autonomous landing method based on visual information fusion |
CN110383196B (en) * | 2018-07-02 | 2023-07-11 | 深圳市大疆创新科技有限公司 | Unmanned aerial vehicle return control method and device and unmanned aerial vehicle |
CN108983811A (en) * | 2018-07-23 | 2018-12-11 | 南京森林警察学院 | A kind of dual-purpose unmanned plane device based on FPGA |
CN110322462B (en) * | 2019-06-13 | 2021-07-27 | 暨南大学 | Unmanned aerial vehicle visual landing method and system based on 5G network |
US11829162B2 (en) | 2019-08-15 | 2023-11-28 | Teledyne Flir Detection, Inc. | Unmanned aerial vehicle locking landing pad |
CN110989687B (en) * | 2019-11-08 | 2021-08-10 | 上海交通大学 | Unmanned aerial vehicle landing method based on nested square visual information |
US11767110B2 (en) | 2019-12-16 | 2023-09-26 | FLIR Unmanned Aerial Systems AS | System for storing, autonomously launching and landing unmanned aerial vehicles |
CN111413708A (en) * | 2020-04-10 | 2020-07-14 | 湖南云顶智能科技有限公司 | Unmanned aerial vehicle autonomous landing site selection method based on laser radar |
CN113504794B (en) * | 2021-07-23 | 2023-04-21 | 中国科学院地理科学与资源研究所 | Unmanned aerial vehicle cluster reconstruction method, unmanned aerial vehicle cluster reconstruction system and electronic equipment |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100567898C (en) * | 2007-01-16 | 2009-12-09 | 北京航空航天大学 | Pilotless plane landing navigation method and device thereof |
US20090306840A1 (en) * | 2008-04-08 | 2009-12-10 | Blenkhorn Kevin P | Vision-based automated landing system for unmanned aerial vehicles |
CN102156480A (en) * | 2010-12-30 | 2011-08-17 | 清华大学 | Unmanned helicopter independent landing method based on natural landmark and vision navigation |
CN102538782B (en) * | 2012-01-04 | 2014-08-27 | 浙江大学 | Helicopter landing guide device and method based on computer vision |
CN102854882B (en) * | 2012-09-21 | 2015-07-15 | 苏州工业园区职业技术学院 | Automatic control system of three-wing two-paddle recombination type unmanned aerial vehicle (UAV) |
CN103963954B (en) * | 2013-01-28 | 2018-09-21 | 上海科斗电子科技有限公司 | Thermal energy regulates and controls levitation device |
CN103226356A (en) * | 2013-02-27 | 2013-07-31 | 广东工业大学 | Image-processing-based unmanned plane accurate position landing method |
CN103809598B (en) * | 2014-03-12 | 2016-08-10 | 北京航空航天大学 | A kind of rotor wing unmanned aerial vehicles based on three layers of isosceles triangle polychrome landing ramp autonomous landing system |
CN104166854B (en) * | 2014-08-03 | 2016-06-01 | 浙江大学 | For the visual rating scale terrestrial reference positioning identifying method of miniature self-service machine Autonomous landing |
CN204406209U (en) * | 2015-03-10 | 2015-06-17 | 无锡桑尼安科技有限公司 | Autonomous Landing of UAV system |
-
2015
- 2015-03-10 CN CN201510104994.6A patent/CN104679013A/en active Pending
- 2015-03-10 CN CN201510574657.3A patent/CN105182995B/en not_active Expired - Fee Related
- 2015-03-10 CN CN201610263384.5A patent/CN105955289A/en not_active Withdrawn
- 2015-03-10 CN CN201510574773.5A patent/CN105068553A/en active Pending
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018069961A (en) * | 2016-10-31 | 2018-05-10 | 株式会社エンルートM’s | Device arrangement device, device arrangement method, and device arrangement program |
Also Published As
Publication number | Publication date |
---|---|
CN105182995A (en) | 2015-12-23 |
CN105182995B (en) | 2016-09-07 |
CN104679013A (en) | 2015-06-03 |
CN105068553A (en) | 2015-11-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105182995B (en) | Autonomous Landing of UAV system | |
CN204406209U (en) | Autonomous Landing of UAV system | |
Giordan et al. | The use of unmanned aerial vehicles (UAVs) for engineering geology applications | |
CN105865454B (en) | A kind of Navigation of Pilotless Aircraft method generated based on real-time online map | |
Rinaudo et al. | Archaeological site monitoring: UAV photogrammetry can be an answer | |
Küng et al. | The accuracy of automatic photogrammetric techniques on ultra-light UAV imagery | |
US9903719B2 (en) | System and method for advanced navigation | |
CN103822635A (en) | Visual information based real-time calculation method of spatial position of flying unmanned aircraft | |
CN103852077B (en) | Automatic anti-cheating judgment method for unmanned aerial vehicle positioning information in link failure process | |
CN113625774B (en) | Local map matching and end-to-end ranging multi-unmanned aerial vehicle co-location system and method | |
CN101126639A (en) | Quick low altitude remote sensing image automatic matching and airborne triangulation method | |
CN104820434A (en) | Velocity measuring method of ground motion object by use of unmanned plane | |
CN102937443A (en) | Target rapid positioning system and target rapid positioning method based on unmanned aerial vehicle | |
CN109341543A (en) | A vision-based image height calculation method | |
CN105847753A (en) | Transmission device identification platform located on unmanned aerial vehicle | |
Williams et al. | All-source navigation for enhancing UAV operations in GPS-denied environments | |
Kong et al. | A ground-based multi-sensor system for autonomous landing of a fixed wing UAV | |
Schleiss et al. | VPAIR--Aerial Visual Place Recognition and Localization in Large-scale Outdoor Environments | |
CN105208346B (en) | Transmission facility identification method based on unmanned plane | |
CN106018402B (en) | Visibility detection system and method for a UAV-borne catadioptric panoramic stereo camera | |
Starek et al. | Application of unmanned aircraft systems for coastal mapping and resiliency | |
CN105676875A (en) | Automatic landing system of unmanned aerial vehicle | |
CN105242248A (en) | Radar captive carrying test position parameter automatic binding method based on measurement and control equipment | |
Sharma et al. | Integrating the UAS in Undergraduate Teaching and Research–Opportunities and Challenges at University of North Georgia | |
Panigrahi et al. | Design criteria of a UAV for ISTAR and remote sensing applications |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication | ||
WW01 | Invention patent application withdrawn after publication |
Application publication date: 20160921 |