CN105000194A - UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark


Info

Publication number
CN105000194A
CN105000194A
Authority
CN
China
Prior art keywords
circular luminous
luminary
led
ground
luminous body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510496861.8A
Other languages
Chinese (zh)
Inventor
史彩成
解乃林
史其存
李秀珍
辛怡
李勤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201510496861.8A priority Critical patent/CN105000194A/en
Publication of CN105000194A publication Critical patent/CN105000194A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a UAV (unmanned aerial vehicle) assisted landing visual guidance method and airborne system based on a ground cooperative mark, belonging to the field of flight navigation control. At least four circular luminous bodies are arranged into a directional ground cooperative mark, and a predetermined landing position is set in front of a vertex lamp. The UAV hovers at a preset height; each highlight region in each frame from the airborne camera is detected and judged to be a circular luminous body or not; the detected features are then compared with the ground cooperative mark features pre-stored in a target library and recognized according to the number of circular luminous bodies and the arrangement relations among them; the position of the vertex lamp and the predetermined landing position are obtained, and the UAV is controlled to descend toward the predetermined landing position while performing target prediction and tracking. The ground cooperative mark imposes few layout constraints and offers high flexibility, multiple marks can guide to different landing positions, the efficiency and reliability of UAV assisted landing guidance are effectively improved, and the method is not limited by GPS (global positioning system).

Description

UAV assisted landing visual guidance method and airborne system based on a ground cooperative mark
Technical field
The present invention relates to a UAV assisted landing visual guidance method and airborne system, and in particular to a UAV visual guidance method and airborne system using ground cooperative marking points. It belongs to the field of flight navigation control.
Background art
Compared with manned aircraft, unmanned aerial vehicles (UAVs) are lightweight, pose little risk of casualties, are highly maneuverable and have simple cabin designs; besides their military prospects, they are increasingly promising in civil fields. UAV navigation technologies mainly include inertial navigation, radar navigation, GPS navigation, microwave navigation and visual navigation. With the development of computer vision and image acquisition, optical measurement, high-speed processing and storage technologies, navigation based on visual information has become a research hotspot in recent years: the equipment is simple, low in power consumption, small in size and autonomously passive, it does not rely on ground or airborne navigation aids, and in particular it is not subject to the limitations of GPS (satellite navigation suffers from signal interference, limited solution accuracy and technical monopoly). This gives it a clear advantage in electronic countermeasures.
Landing navigation is a critical phase of UAV flight; because the approach is affected by flight altitude, weather and the geographic environment, accurate, reliable and automated assisted landing has become one of the key research topics in UAV technology. According to where the camera is installed, vision-aided landing systems can be divided into UAV landing navigation systems based on ground information, airborne landing navigation systems based on artificial targets, and airborne landing navigation systems based on natural scenes. Ground-based systems synchronously photograph the UAV with calibrated ground cameras, extract feature points or cooperative marks, compute position and attitude, and feed the results back to the control system for landing; for example, in the vision-based autonomous landing guidance device and method proposed by Zhang Yanning et al. of Northwestern Polytechnical University (application No. 201410436145.6), measurement cameras are arranged on both sides of the runway and a bright marker lamp is placed directly ahead of the UAV; the three-dimensional position of the UAV is measured to solve the flight parameters, which are sent wirelessly to the flight control computer to guide autonomous landing. Such methods place high demands on ground facilities, involve complex computation, and require radio communication between ground and UAV, which reduces jamming resistance and flight-control reliability. Among airborne systems based on natural scenes, Zhu Jihong et al. of Tsinghua University proposed an autonomous landing scheme for unmanned helicopters based on natural landmarks and visual navigation (application No. 201010623599.6), in which a natural ground landmark is locked as the target, the UAV hovers above it, and then descends to landing under altimeter guidance; however, the method does not explain how the natural target features are discriminated, the descent is strictly vertical, the surroundings of the natural landmark are not constrained in the text, and the safety of the landing site cannot be guaranteed. Airborne assisted landing systems based on artificial targets obtain images of a ground cooperative mark with a camera mounted on the UAV, estimate the flight state and the position and orientation relative to the landing point by pose estimation, and implement landing guidance through the control system.
The ground cooperative mark in airborne assisted landing systems is mostly a signal cloth: airborne vision-navigation landing systems usually require laying a landing signal cloth of special shape on the ground as the cooperative mark. The unmanned helicopter recognizes the image features of the signal cloth and corrects its horizontal displacement and heading. Research at home and abroad on vision-aided landing concentrates on selecting the shape of the cooperative mark on the landing signal cloth and on solving the UAV position and attitude from geometric parameters of the visual image to obtain navigation and positioning parameters. A cooperative target detection algorithm should offer high real-time performance and noise robustness; the designed cooperative target must contain enough information for the computer vision system to extract the relative position and attitude required for autonomous UAV landing, should be easy to recognize and easy to distinguish from other objects and the environment, and should preferably be fairly simple, so as to increase the processing speed of the visual guidance system.
Existing ground cooperative marks for landing sites are mostly regular polygon combinations, H-shaped, L-shaped, T-shaped or circular, and they are commonly designed in high-contrast black and white so that the landmark features can be extracted by image binarization. However, illumination, environment and camera quality all change the gray values of the pixels corresponding to the black and white mark to varying degrees: the gray value of the black part rises and that of the white part falls, reducing their difference. Existing cooperative-mark visual guidance methods include detection based on corner features, where the marks mostly combine rectangles and straight lines but depend heavily on edge extraction; detection based on circle features, e.g., Zhang Yuanmin et al. use a pattern of six circles in which the vertex of the outer triangle is a ring indicating the direction of the landing zone, recognizing circles by a circular-region method and template matching, though the accuracy is inferior to corner-based methods; and mixed corner-and-circle detection, which undoubtedly increases computation and harms real-time performance. There are also detection methods based on moment invariants, such as the "H"-shaped cooperative mark proposed by the University of Southern California and the "T"-shaped cooperative mark proposed by Xu Guili et al. of Nanjing University of Aeronautics and Astronautics; although accuracy and reliability improve somewhat, moment invariants are relatively expensive to compute. Other algorithms include template-matching detection and machine learning based on statistical classification, all limited by computational load and system stability. Some cooperative marks are coated with infrared paint: in the cooperative target design and positioning method for autonomous UAV shipboard landing proposed by Li et al. of Beijing University of Aeronautics and Astronautics (application No. 201310125318.8), three "T"-shaped marks coated with red infrared paint are arranged on both sides of the ship runway, the two strokes of each "T" obey a fixed ratio, an airborne infrared camera photographs the ground cooperative mark images for feature extraction and positioning, and the motion state of the ship must also be determined to fix the exact shipboard landing position of the UAV. Similarly, the cooperative target recognition method for autonomous UAV landing proposed by Xu Guili et al. of Nanjing University of Aeronautics and Astronautics (ZL200510095085.7) uses directional chain codes of the target contour to extract the aspect ratio of a cross-shaped ground cooperative target and determines the autonomous landing direction from the target shape features.
In addition, Lei Xusheng et al. of Beijing University of Aeronautics and Astronautics proposed an autonomous landing system for rotor UAVs based on a three-layer isosceles-triangle multicolor landing pad (application No. 201410089860.7), which uses nested colored isosceles triangles as the visual landing beacon. The system relies on communication between a ground monitoring station and the UAV: the landing pad image captured by the airborne camera is sent to the ground station through a wireless image transmission module, the vision processing unit of the ground station performs the image computation to obtain the visual navigation information, which is then returned to the airborne system through a wireless data transmission module. Such methods add constraints on the landing site and application scenario, and the radio transmission also reduces system reliability.
There are also methods that control aircraft landing by combining graphic object recognition with ranging based on wireless signal strength, such as patent application No. 201410234164.0 of Shenzhen DJI Innovation Technology Co., Ltd.; the present invention, by contrast, needs no estimation of the distance to a signal source.
Summary of the invention
In order to build an accurate and reliable UAV assisted landing method, the present invention arranges at least 4 circular luminous bodies into a ground cooperative mark and uses static-feature and dynamic-feature recognition of the images taken by the airborne camera, proposing a UAV assisted landing visual guidance method based on ground cooperative mark point recognition, comprising the following steps:
Step 1: lay the ground cooperative mark
The ground cooperative mark comprises at least 4 solid circular luminous bodies, each numbered; the diameter of each circular luminous body is at least 300 mm, and the distance between the centroids of any two circular luminous bodies is at least 3 times the radius of a circular luminous body. The circular luminous bodies are fixed on a dark ground background, and the shape they form is directional when seen from the air, i.e., when the shape formed by the circular luminous bodies is rotated as a whole parallel to the ground, the numbering of the circular luminous bodies cannot be confused;
One circular luminous body located on the periphery is taken as the vertex lamp; the predetermined landing position is set in front of the vertex lamp, its distance to each of the other circular luminous bodies is greater than its distance to the vertex lamp, and the distance between the predetermined landing position and the edge of the vertex lamp is greater than the maximum length of the UAV body;
Each predetermined landing position corresponds to a specific ground cooperative mark, which comprises a specific number and arrangement of circular luminous bodies;
A target library is established, in which the following are pre-stored for each ground cooperative mark: the number of circular luminous bodies it comprises, the radius and number of each circular luminous body, and the arrangement relations among the circular luminous bodies, together with the number of the vertex lamp and the positional relation between the predetermined landing position and the vertex lamp;
The airborne camera is mounted on the UAV with its optical axis kept pointing downward;
Preferably, each solid circular luminous body is coiled from an LED strip.
Preferably, each solid circular luminous body is formed from an LED lamp group arranged into a disc.
Preferably, the airborne camera is a near-infrared camera.
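For illustration, the target library of step 1 can be organized as one record per ground cooperative mark, keyed by its lamp count so that the quantity check of step 4.1 becomes a lookup. The following minimal Python sketch is only an assumed layout; the field names (`vertex_lamp_id`, `landing_offset`, etc.) and the example values are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class GroundMark:
    """One ground cooperative mark pre-stored in the target library."""
    mark_id: int
    lamp_count: int            # number of circular luminous bodies
    radii: dict                # lamp number -> radius in meters
    centroid_distances: dict   # (lamp a, lamp b) -> centroid distance in meters
    vertex_lamp_id: int        # number of the vertex lamp
    landing_offset: tuple      # predetermined landing position relative to the
                               # vertex lamp centroid, (dx, dy) in meters

# Keyed by lamp count: step 4.1 (1) first selects the mark whose count equals
# the number of detected circular luminous bodies.
target_library = {
    5: GroundMark(mark_id=1, lamp_count=5,
                  radii={i: 0.15 for i in range(1, 6)},
                  centroid_distances={(1, 2): 2.0},  # filled from the site survey
                  vertex_lamp_id=1,
                  landing_offset=(0.0, 3.0)),
}
```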
Step 2: the UAV hovers at a predetermined altitude and assisted landing guidance starts; the confidence is initialized to 0, the target tracking flag is initialized to 0, and the number of potential luminous-body targets and the number of luminous bodies to be recognized are initialized to 0;
Continuously detect and judge whether a potential ground cooperative mark exists on the ground, as follows:
For each frame captured by the airborne camera, detect the bright pixels and merge adjacent bright pixels into highlight regions; take each highlight region as a potential luminous-body target, i.e., consider that it may be one luminous body of a ground cooperative mark;
Step 3: target static recognition:
For the image captured in step 2, extract the static features of each potential luminous-body target in the frame, including its perimeter, radius, area and centroid position, and judge whether it is a circular luminous body: if it is judged to be a circular luminous body, update the number of potential luminous-body targets; otherwise consider the potential target to be interference. Repeat until every potential luminous-body target in the frame has been judged;
Step 4: obtain the predetermined landing position:
Step 4.1: compare the static features of each circular luminous body obtained in step 3 with the ground cooperative mark features pre-stored in the target library established in step 1, to determine whether they are the same target; the target static features here refer to the number of circular luminous bodies and the arrangement relations among the circular luminous bodies. The judgment is as follows:
(1) obtain the number of potential luminous-body targets in the current image from step 3 and compare it with the number of circular luminous bodies of each ground cooperative mark pre-stored in the target library, selecting the mark whose count equals the count obtained in step 3 as the mark to be matched; if no pre-stored mark has an equal count, return to step 2 and detect the ground cooperative mark again;
(2) compare the static features of each luminous body of the mark to be matched with those obtained in step 3 and determine whether they are the same target;
Step 4.2: if they are the same target, obtain the vertex lamp number and position from the information pre-stored in the target library, i.e., obtain the centroid coordinates of the vertex lamp, obtain the number of each circular luminous body, set the target tracking flag to 1 and set the confidence to T; from the centroid coordinates of the vertex lamp obtained in the current frame and the positional relation, pre-stored in the target library, between the vertex lamp and the predetermined landing position corresponding to this ground cooperative mark, obtain the coordinates of the predetermined landing position;
If they are not the same target, return to step 2 and detect the ground cooperative mark again;
Step 5: judge whether the target tracking flag equals 1; if it does, compute the pitch deviation and azimuth deviation between the predetermined landing position and the camera optical axis from the current UAV body state parameters and the predetermined landing position obtained in step 4, and send them to the flight state controller of the UAV, so as to control the UAV to descend toward the predetermined landing position while the airborne camera keeps acquiring images of the ground cooperative mark in real time; the current UAV body state parameters comprise yaw, pitch and roll;
If the target tracking flag does not equal 1, return to step 2 and detect the ground cooperative mark again;
Step 6: perform target prediction and tracking during the UAV descent:
For each frame taken by the airborne camera during the descent, compute all observed values and their corresponding predicted values as follows, and perform a correlation judgment:
Step 6.1: buffer, in pipeline fashion, N consecutive frames taken by the airborne camera, with N at least 3. The observed values of each frame are the area, mean gray value and centroid coordinates of each of the M circular luminous bodies of the ground cooperative mark. Using two-variable linear regression on the observations of these N frames, predict the area, mean gray value and centroid coordinates of each circular luminous body in the next, (N+1)-th, frame, i.e., obtain for the i-th circular luminous body of the mark in frame (N+1) its area, mean gray value, X-direction centroid position and Y-direction centroid position;
Step 6.2: correlation judgment:
From frame (N+1) on, perform a correlation judgment between the observed values of the area, mean gray value, X-direction centroid position and Y-direction centroid position of each circular luminous body in the current frame and the corresponding predicted values for this frame given by step 6.1; when the error between every observed value of this frame and its corresponding predicted value is less than a predetermined threshold K, the frame is considered to satisfy the correlation condition; otherwise it is considered not to satisfy it;
Step 6.3: when the frame does not satisfy the correlation condition, decrement the current confidence by 1, and replace all 4M observed values of the frame, namely the area, mean gray value, X-direction centroid position and Y-direction centroid position of each circular luminous body, by the corresponding predicted values of this frame; these are used in step 6.4 to obtain the vertex lamp position, and iterated again to obtain the predictions for the next frame; the predicted values are obtained by the regression prediction of step 6.1 from the observations of the previous N frames;
When the frame satisfies the correlation condition, reset the current confidence to the set value T, use all 4M observed values of the frame to obtain the vertex lamp position in step 6.4, and iterate again to obtain the predictions for the next frame;
Step 6.4: when the confidence is greater than 0, set the current target tracking flag to 1; then compute the predetermined landing position from the centroid coordinates of the vertex lamp in the current-frame values and the relation, pre-stored in the target library, between the vertex lamp and the predetermined landing position of this mark; then, from the current UAV body state parameters and the predetermined landing position, compute the pitch and azimuth deviations between the predetermined landing position and the camera optical axis; the flight state controller controls the UAV to track the predetermined landing position and descend until the UAV lands at the predetermined landing position; the current UAV body state parameters comprise yaw, pitch and roll;
When the confidence is less than or equal to 0, the target is considered lost and the ground cooperative mark cannot be tracked correctly in the current flight; the target tracking flag is then set to 0, the flight state controller stops the descent, and the process returns to step 2 to detect the ground cooperative mark again.
Compared with the prior art, the beneficial effects of the present invention are: using a directional ground cooperative mark together with static-feature recognition and dynamic prediction tracking on the images taken by the airborne camera, the invention achieves visual guidance of the UAV to the predetermined landing position. The method is simple and its error is small; the ground cooperative mark has few layout constraints and high flexibility, multiple types of marks can guide to different landing positions, and no ground monitoring station is needed. Moreover, the invention can adopt an eccentric tracking strategy that keeps a certain distance between the predetermined landing position and the ground cooperative mark, protecting the mark from damage. The efficiency and reliability of UAV assisted landing guidance are thus effectively improved, and the method is limited by neither GPS nor visibility.
Description of the drawings
Fig. 1 is a flowchart of the method of the invention;
Fig. 2 is a schematic diagram of the ground cooperative mark composed of 5 circular luminous bodies described in embodiment 1;
Fig. 3 is a schematic diagram of the ground cooperative mark composed of 7 circular luminous bodies described in embodiment 2;
Fig. 4 is a block diagram of the UAV assisted landing visual guidance airborne system based on a ground cooperative mark.
Detailed description of the invention
The present invention is described in detail below with reference to the drawings and embodiments, which also illustrate the technical problems solved and the beneficial effects of the technical solution; it should be noted that the described embodiments are only intended to facilitate understanding of the present invention and impose no limitation on it.
The detailed implementation of the UAV assisted landing visual guidance method based on ground cooperative mark point recognition of the present invention is described below with reference to the embodiments; as shown in Fig. 1, the method comprises the following steps:
Step 1: lay the ground cooperative mark
The ground cooperative mark comprises at least 4 equally sized solid circular luminous bodies coiled from LED strips, each numbered (see embodiments 1 and 2); the diameter of each circular luminous body is at least 300 mm, and the distance between the centroids of any two circular luminous bodies is at least 3 times the radius of a circular luminous body. The circular luminous bodies are fixed on a dark ground background, and the shape they form is directional when seen from the air, i.e., when the shape formed by the circular luminous bodies is rotated as a whole parallel to the ground, the numbering of the circular luminous bodies cannot be confused;
One circular luminous body located on the periphery is taken as the vertex lamp; the predetermined landing position is set in front of the vertex lamp, its distance to each of the other circular luminous bodies is greater than its distance to the vertex lamp, and the distance between the predetermined landing position and the edge of the vertex lamp is greater than the maximum length of the UAV body; keeping a certain distance between the predetermined landing position and the vertex lamp protects the ground cooperative mark from damage when the UAV lands;
Each predetermined landing position corresponds to a specific ground cooperative mark, which comprises a specific number and arrangement of circular luminous bodies;
A target library is established, in which the following are pre-stored for each ground cooperative mark: the number of circular luminous bodies it comprises, the radius and number of each circular luminous body, and the arrangement relations among the circular luminous bodies, together with the number of the vertex lamp and the positional relation between the predetermined landing position and the vertex lamp;
The airborne camera is mounted on the UAV with its optical axis kept pointing downward at an angle of at least 1.5° to the vertical, tilted toward the front of the UAV, so as to perform eccentric tracking.
Preferably, each circular luminous body is laid on dark nylon cloth, which facilitates detection and recognition;
Step 2: the UAV hovers at a predetermined altitude and assisted landing guidance starts; the confidence is initialized to 0, the target tracking flag is initialized to 0, and the number of potential luminous-body targets and the number of LEDs to be recognized are initialized to 0;
Continuously detect and judge whether a potential ground cooperative mark exists on the ground, as follows:
For each frame captured by the airborne camera, detect the bright pixels and merge adjacent bright pixels into highlight regions; take each highlight region as a potential luminous-body target, i.e., consider that it may be one luminous body of a ground cooperative mark;
Preferably, the predetermined altitude interval is 100 m to 150 m, i.e., detection of the ground cooperative mark starts when the UAV is 100 m to 150 m above the ground; the ground positioning error is then about 2.5 m.
Further, when the predetermined altitude is 100 m to 150 m, the luminous flux of each circular luminous body is at least 113000 lm; the airborne camera has a resolution of 720×288, a pixel depth of at least 8 bit, a sensitivity of at most 0.2 lux, an F-number of 1.4 and a field of view of 9 to 20 degrees.
Preferably, adjacent bright pixels are merged into highlight regions by first merging adjacent bright pixels within each image row into objects, and then merging objects in different rows that intersect. A minimal sketch follows.
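As a concrete reading of this preferred row-then-column merging, the sketch below first collects the bright pixels of each row into runs and then unites runs of neighboring rows whose column ranges intersect, using a small union-find; the threshold value and the NumPy image representation are assumptions, not specified by the patent.

```python
import numpy as np

def highlight_regions(img, threshold=200):
    """Merge adjacent bright pixels into highlight regions.

    Step 1: merge adjacent bright pixels of each row into runs.
    Step 2: unite runs of neighboring rows whose column ranges intersect.
    Returns a list of regions; each region is a list of (row, col_start, col_end).
    """
    runs, parent = [], []          # all runs and their union-find parents
    prev_row = []                  # run ids of the previous row

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for r in range(img.shape[0]):
        bright = img[r] >= threshold
        row, c = [], 0
        while c < img.shape[1]:
            if bright[c]:
                start = c
                while c < img.shape[1] and bright[c]:
                    c += 1
                rid = len(runs)
                runs.append((r, start, c - 1))
                parent.append(rid)
                row.append(rid)
            else:
                c += 1
        for rid in row:                     # merge with the previous row
            _, s, e = runs[rid]
            for pid in prev_row:
                _, ps, pe = runs[pid]
                if ps <= e and pe >= s:     # column ranges intersect
                    parent[find(pid)] = find(rid)
        prev_row = row

    regions = {}
    for rid, run in enumerate(runs):
        regions.setdefault(find(rid), []).append(run)
    return list(regions.values())
```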
Step 3: target static recognition:
For the image captured in step 2, extract the static features of each potential luminous-body target in the frame, including its area and position, and judge whether it is a circular luminous body of the ground cooperative mark, as follows:
Step 3.1: obtain the perimeter $L_i$ of the i-th potential luminous-body target; the radius of this target is then $R_i = L_i / 2\pi$, where the perimeter is the number of pixels on the edge of the target in this frame;
Step 3.2: compute the centroid $(\bar{x}_i, \bar{y}_i)$ of this target and construct the standard circle equation $(x - \bar{x}_i)^2 + (y - \bar{y}_i)^2 = R_i^2$ centered at the centroid; the coordinate origin is at the upper-left corner of the image, with the x-axis positive to the right and the y-axis positive downward;
As preferably, the method for calculating of the centre of form of each luminary is as follows:
x ‾ i = ∫ ∫ Q i x d x d y S i
y ‾ i = ∫ ∫ Q i y d x d y S i
Wherein, this luminary area by this luminary overlay area Q isize determine; .
Step 3.3: let the actual edge coordinates of the target be $(x_t, y_t)$. Compute the mean square error $E_x$ between the abscissa of the target edge and the abscissa of the standard circle formed about the centroid, and likewise the mean square error $E_y$ of the ordinates:

$$E_x = \frac{\sum_{x=\bar{x}-R}^{\bar{x}+R} \left[ \sum_{y=\bar{y}-R}^{\bar{y}+R} (x - x_t)^2 \right]}{L_i}, \qquad E_y = \frac{\sum_{x=\bar{x}-R}^{\bar{x}+R} \left[ \sum_{y=\bar{y}-R}^{\bar{y}+R} (y - y_t)^2 \right]}{L_i}.$$

Compute the maximum error $E_{\max\_x}$ between the abscissa of the standard circle and the target edge abscissa, and likewise the maximum error $E_{\max\_y}$ of the ordinates:

$$E_{\max\_x} = \max_{\substack{\bar{x}-R_i \le x \le \bar{x}+R_i \\ \bar{y}-R_i \le y \le \bar{y}+R_i}} (x - x_t)^2, \qquad E_{\max\_y} = \max_{\substack{\bar{x}-R_i \le x \le \bar{x}+R_i \\ \bar{y}-R_i \le y \le \bar{y}+R_i}} (y - y_t)^2.$$
If $E_x$ and $E_y$ are both less than 1 and $E_{\max\_x}$ and $E_{\max\_y}$ are both at most 1, the target is judged to be a circular luminous body (an LED) of the ground cooperative mark and the number of potential luminous-body targets is updated; otherwise it is judged to be interference. Repeat until every potential luminous-body target in the frame has been judged;
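The circularity test of steps 3.1-3.3 can be sketched as follows. The perimeter-to-radius relation $R_i = L_i/2\pi$ and the unit thresholds follow the text; the comparison of each actual edge pixel against the point of the ideal circle at the same polar angle is one assumed reading of the $E_x$, $E_y$ definitions, since the patent leaves the pairing of $(x, y)$ with $(x_t, y_t)$ implicit.

```python
import math

def is_circular_luminous_body(edge_pixels, region_pixels):
    """Static circularity test (a sketch of steps 3.1-3.3).

    edge_pixels:   list of (x, y) coordinates on the region's edge
    region_pixels: list of (x, y) coordinates covered by the region
    """
    L = len(edge_pixels)                 # perimeter = number of edge pixels
    R = L / (2.0 * math.pi)              # step 3.1: radius from perimeter
    S = len(region_pixels)               # area = number of covered pixels
    cx = sum(x for x, _ in region_pixels) / S   # centroid: discrete form of
    cy = sum(y for _, y in region_pixels) / S   # the integral expressions

    ex_sum = ey_sum = ex_max = ey_max = 0.0
    for xt, yt in edge_pixels:
        # ideal-circle point at the same polar angle as the edge pixel
        ang = math.atan2(yt - cy, xt - cx)
        dx2 = (cx + R * math.cos(ang) - xt) ** 2
        dy2 = (cy + R * math.sin(ang) - yt) ** 2
        ex_sum += dx2
        ey_sum += dy2
        ex_max = max(ex_max, dx2)
        ey_max = max(ey_max, dy2)

    Ex, Ey = ex_sum / L, ey_sum / L      # mean square errors of step 3.3
    return Ex < 1.0 and Ey < 1.0 and ex_max <= 1.0 and ey_max <= 1.0
```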
Step 4: obtain the predetermined landing position:
Step 4.1: compare the target static features with the ground cooperative mark features pre-stored in the target library established in step 1, to determine whether they are the same target; the target static features here refer to the number of circular luminous bodies and the arrangement relations among the circular luminous bodies. The judgment is as follows:
(1) obtain the number of potential luminous-body targets in the current image from step 3 and compare it with the number of circular luminous bodies of each ground cooperative mark pre-stored in the target library, selecting the mark whose count equals the count obtained in step 3 as the mark to be matched; if no pre-stored mark has an equal count, return to step 2 and detect the ground cooperative mark again;
(2) compare the static features of each luminous body of the mark to be matched with those obtained in step 3 and determine whether they are the same target;
Preferably, the following parameters are computed and compared to judge whether the mark to be matched and the luminous bodies obtained in step 3 are the same target (a sketch follows the list):
1. the distances between the centroids of the circular luminous bodies;
2. the number of luminous bodies on the line between the centroids of any two circular luminous bodies;
3. the angles between the lines connecting the centroids of the circular luminous bodies.
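A sketch of comparisons 1 and 3 (centroid distances and line angles) is given below; comparison 2, the lamp count on each connecting line, can be added as a point-to-line distance test over the template's collinear groups. The 3° angle tolerance is the one stated for embodiment 2; the 10% distance tolerance and the scale normalization by a reference pair are assumptions, since image scale varies with altitude.

```python
import math

def match_static_features(detected, template, dist_tol=0.10, angle_tol=3.0):
    """Compare detected lamp centroids with a pre-stored mark (a sketch).

    detected: dict lamp number -> (x, y) image centroid, tentatively numbered
    template: {'distances': {(a, b): d}, 'angles': {((a, b), (c, d)): deg}}
              with distances in ground units (e.g. multiples of the
              vertex-lamp radius R)
    """
    def dist(a, b):
        (xa, ya), (xb, yb) = detected[a], detected[b]
        return math.hypot(xa - xb, ya - yb)

    # remove scale: express all detected distances in template units
    (a0, b0), d0 = next(iter(template['distances'].items()))
    scale = d0 / dist(a0, b0)

    # 1. centroid-to-centroid distances
    for (a, b), d_t in template['distances'].items():
        if abs(dist(a, b) * scale - d_t) / d_t > dist_tol:
            return False

    # 3. angles between connecting lines
    def direction(a, b):
        (xa, ya), (xb, yb) = detected[a], detected[b]
        return math.degrees(math.atan2(yb - ya, xb - xa))

    for ((a, b), (c, d)), deg_t in template['angles'].items():
        deg = abs(direction(a, b) - direction(c, d)) % 180.0
        deg = min(deg, 180.0 - deg)          # acute angle between the lines
        if abs(deg - deg_t) > angle_tol:
            return False
    return True
```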
Step 4.2: if they are the same target, obtain the vertex lamp number and position from the information pre-stored in the target library, i.e., obtain the centroid coordinates of the vertex lamp, obtain the number of each circular luminous body, set the target tracking flag to 1 and set the confidence to T = 100; from the centroid coordinates of the vertex lamp obtained in the current frame and the positional relation, pre-stored in the target library, between the vertex lamp and the predetermined landing position corresponding to this ground cooperative mark, obtain the coordinates of the predetermined landing position;
If they are not the same target, return to step 2 and detect the ground cooperative mark again;
Step 5: judge whether the target tracking flag equals 1; if it does, compute, from the current UAV body state parameters (yaw, pitch, roll) and the predetermined landing position obtained in step 4, the pitch deviation and azimuth deviation between the predetermined landing position and the camera optical axis, obtain the UAV flight control parameters (yaw, pitch), and send them to the flight state controller of the UAV, so as to control the UAV to descend toward the predetermined landing position while the airborne camera keeps acquiring images of the ground cooperative mark in real time;
If the target tracking flag does not equal 1, return to step 2 and detect the ground cooperative mark again;
Step 6: to prevent false recognition and reduce the false alarm rate, perform target prediction and tracking from the target dynamic features during the UAV descent (this processing is carried out inside the image processor):
Since the image frame rate in this embodiment is 50 Hz (PAL standard video) and the frame period is short, the area, mean gray value and position of each circular luminous body in the ground cooperative mark images taken by the airborne camera during the descent are strongly correlated, and their variation can be predicted over a short time.
For each frame taken by the airborne camera during the descent, compute all observed values and their corresponding predicted values as follows, and perform a correlation judgment:
Step 6.1: buffer, in pipeline fashion, N consecutive frames taken by the airborne camera, with N at least 3; preferably N is at least 6. The observed values of each frame are the area, mean gray value and centroid coordinates (in the optical-axis coordinate system) of each of the M circular luminous bodies of the ground cooperative mark. Using two-variable linear regression on the observations of these N frames, predict the area, mean gray value and centroid coordinates of each circular luminous body in the next, (N+1)-th, frame, as follows:
Let the observed area, mean gray value and centroid coordinates of the i-th circular luminous body in the j-th frame be $S_{ij}$, $G_{ij}$, $X_{ij}$, $Y_{ij}$ respectively, where the mean gray value is

$$G_{ij} = \frac{1}{S_{ij}} \sum_{(x,y) \in S_{ij}} g(x, y),$$

with $g(x, y)$ the gray value of each pixel within the area $S_{ij}$ of the i-th circular luminous body in the j-th frame, and $X_{ij}$, $Y_{ij}$ the horizontal and vertical centroid coordinates of the i-th circular luminous body in the j-th frame;
Let the regression coefficients of $S_{ij}$ be $a_{Si}, b_{Si}, c_{Si}$; of $G_{ij}$ be $a_{gi}, b_{gi}, c_{gi}$; of $X_{ij}$ be $a_{xi}, b_{xi}, c_{xi}$; and of $Y_{ij}$ be $a_{yi}, b_{yi}, c_{yi}$ (note that as long as the prediction window length N is unchanged, these coefficients of the i-th circular luminous body are constants, hence they carry no subscript j). Let the frame-number sequence be $x_1 = \{1, 2, \ldots, N\}$ and the squared frame-number sequence be $x_2 = \{1, 4, \ldots, N^2\}$. The coefficients are obtained by elimination from four systems of equations of the same form, one per quantity $Z \in \{S, G, X, Y\}$:
$$\begin{cases} a_{Zi} \sum_{j=1}^{N} x_1(j) + b_{Zi} \sum_{j=1}^{N} x_2(j) + N c_{Zi} = \sum_{j=1}^{N} Z_{ij} \\[4pt] a_{Zi} \sum_{j=1}^{N} x_1^2(j) + b_{Zi} \sum_{j=1}^{N} x_1(j) x_2(j) + c_{Zi} \sum_{j=1}^{N} x_1(j) = \sum_{j=1}^{N} Z_{ij} \, x_1(j) \\[4pt] a_{Zi} \sum_{j=1}^{N} x_1^3(j) + b_{Zi} \sum_{j=1}^{N} x_1^2(j) x_2(j) + c_{Zi} \sum_{j=1}^{N} x_1^2(j) = \sum_{j=1}^{N} Z_{ij} \, x_1^2(j) \end{cases}$$
The area, mean gray value, X-direction centroid position and Y-direction centroid position of the i-th circular luminous body of the ground cooperative mark in frame (N+1) are then obtained from the regression prediction model:

$$\begin{cases} S_{i(N+1)} = (N+1) \, a_{Si} + (N+1)^2 \, b_{Si} + c_{Si} \\ G_{i(N+1)} = (N+1) \, a_{gi} + (N+1)^2 \, b_{gi} + c_{gi} \\ X_{i(N+1)} = (N+1) \, a_{xi} + (N+1)^2 \, b_{xi} + c_{xi} \\ Y_{i(N+1)} = (N+1) \, a_{yi} + (N+1)^2 \, b_{yi} + c_{yi} \end{cases}$$
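Each of the four systems is a 3×3 linear solve per circular luminous body; equivalently, the model value $= a \cdot x_1 + b \cdot x_2 + c$ with $x_1 = j$ and $x_2 = j^2$ can be fitted by ordinary least squares, which satisfies exactly these normal equations. A minimal sketch with NumPy (an assumed implementation convenience, not part of the patent):

```python
import numpy as np

def predict_next(obs):
    """Two-variable linear regression prediction of step 6.1 (a sketch).

    obs: the N observed values of one quantity (area, mean gray value,
         centroid X or centroid Y) of one circular luminous body,
         obs[j - 1] = value at frame j, with N >= 3.
    Returns the predicted value for frame N + 1.
    """
    N = len(obs)
    j = np.arange(1, N + 1, dtype=float)
    A = np.column_stack([j, j ** 2, np.ones(N)])   # columns x1, x2, constant
    a, b, c = np.linalg.lstsq(A, np.asarray(obs, float), rcond=None)[0]
    return a * (N + 1) + b * (N + 1) ** 2 + c

# Applied each frame to all 4M quantities (M lamps x 4 quantities), e.g.:
area_pred = predict_next([50.0, 52.0, 55.0, 59.0, 64.0, 70.0])  # N = 6 frames
```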
Step 6.2: correlation judgment:
From frame (N+1) on, perform a correlation judgment between the observed values of the area, mean gray value, X-direction centroid position and Y-direction centroid position of each circular luminous body in the current frame and the corresponding predicted values for this frame given by the above regression prediction model; when the error between every observed value of this frame and its corresponding predicted value is less than K = 20%, the frame is considered to satisfy the correlation condition, otherwise not; the error between an observed value and its predicted value is computed as (observed − predicted) / predicted;
Step 6.3: when the frame does not satisfy the correlation condition, decrement the current confidence by 1, and replace all 4M observed values of the frame, namely the area, mean gray value, X-direction centroid position and Y-direction centroid position of each circular luminous body, by the corresponding predicted values of this frame; these are used in step 6.4 to obtain the vertex lamp position, and iterated again to obtain the predictions for the next frame; the predicted values are obtained by the regression prediction of step 6.1 from the observations of the previous N frames;
When the frame satisfies the correlation condition, reset the current confidence to the set value T = 100, use all 4M observed values of the frame to obtain the vertex lamp position in step 6.4, and iterate again to obtain the predictions for the next frame;
Step 6.4: when the confidence is greater than 0, set the current target tracking flag to 1; then compute the predetermined landing position from the centroid coordinates of the vertex lamp in the current-frame values and the relation, pre-stored in the target library, between the vertex lamp and the predetermined landing position of this mark; then, from the current UAV body state parameters (yaw, pitch, roll) and the predetermined landing position, compute the pitch and azimuth deviations between the predetermined landing position and the camera optical axis; the flight state controller controls the UAV to track the predetermined landing position and descend until the UAV lands at the predetermined landing position;
When the confidence is less than or equal to 0, the target is considered lost and the ground cooperative mark cannot be tracked correctly in the current flight; the target tracking flag is then set to 0, the flight state controller stops the descent, and the process returns to step 2 to detect the ground cooperative mark again;
When the UAV is very close to the ground cooperative mark, the camera may no longer capture all the LEDs in the frame, so guidance in the final stage relies on the predicted values alone.
Under this method, when the target cannot be recognized because the signal-to-noise ratio is low, frames are lost or lamps fail, the confidence is set to 0.
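Steps 6.2-6.4, together with the remarks above, amount to a per-frame update that switches between observed and predicted values and maintains the confidence; the sketch below uses K = 20% and T = 100 as given, while the dictionary layout and the handling of the deviation output are assumptions (the pitch/azimuth computation toward the flight state controller is left abstract).

```python
K = 0.20   # relative-error threshold of step 6.2
T = 100    # confidence reset value

def track_frame(observed, predicted, state):
    """One iteration of steps 6.2-6.4 over all 4M quantities (a sketch).

    observed, predicted: dicts mapping a quantity key, e.g. (lamp, 'area'),
                         to its value in the current frame
    state: dict holding 'confidence' and 'tracking_flag'
    Returns the values used to locate the vertex lamp in this frame.
    """
    correlated = all(
        abs(observed[k] - predicted[k]) / abs(predicted[k]) < K
        for k in predicted
    )
    if correlated:
        state['confidence'] = T      # step 6.3: reset the confidence
        used = observed              # keep the observed values
    else:
        state['confidence'] -= 1     # step 6.3: decrement the confidence
        used = predicted             # substitute the predicted values;
                                     # near the ground, or on frame loss or
                                     # lamp failure, guidance runs on these

    if state['confidence'] > 0:      # step 6.4: continue tracking
        state['tracking_flag'] = 1
        # vertex-lamp centroid from `used` -> predetermined landing position
        # -> pitch/azimuth deviations sent to the flight state controller
    else:                            # target lost: stop the descent and
        state['tracking_flag'] = 0   # return to detection (step 2)
    return used
```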
In the present invention, the ground cooperative mark mainly facilitates detection, recognition and tracking by the image processing module and the landing of the UAV. Since all-weather operation is required, the greater the contrast between the mark and the ground the better; the ground cooperative marks of the present invention are therefore all laid on a dark background, such as dark nylon cloth laid and fixed on the ground. Two ground cooperative mark schemes are given in the following embodiments; in both, the arrangement of the circular luminous bodies is directional.
Embodiment 1:
As shown in Fig. 2, the ground cooperative mark comprises 5 circular luminous bodies, of which 4 are arranged in a square and 1 lies outside the square and serves as the vertex lamp. The distance between the center of the predetermined landing position and the centroid of the vertex lamp is set to ΔL.
Each circular luminous body is a solid luminous body of diameter 300 mm coiled from an LED strip; arranged as above, the bodies form an arrow shape, which facilitates tracking and recognizing the lamps and the vertex lamp at the tip of the arrow. The LEDs are powered in parallel from a storage battery.
Embodiment 2:
As shown in Fig. 3, the ground cooperative mark comprises 7 equally sized solid circular LED luminous bodies, of which No. 1 LED is the vertex lamp with radius R,
1. the distance between No. 2 LED and No. 1 LED is $L_{1\text{-}2} = 6.8R$, between No. 3 and No. 1 $L_{1\text{-}3} = 4.8R$, between No. 4 and No. 1 $L_{1\text{-}4} = 5.6R$, between No. 5 and No. 1 $L_{1\text{-}5} = 13.6R$, between No. 6 and No. 1 $L_{1\text{-}6} = 9.6R$, and between No. 7 and No. 1 $L_{1\text{-}7} = 11.2R$; all distances are measured between the LED centroids;
2. the centroids of LEDs No. 1, No. 2 and No. 5 lie on one straight line, denoted LINE1; the centroids of LEDs No. 1, No. 4 and No. 7 lie on one straight line, denoted LINE2; the centroids of LEDs No. 1, No. 3 and No. 6 lie on one straight line, denoted LINE3; the centroids of LEDs No. 2, No. 3 and No. 4 lie on one straight line, denoted LINE4; the centroids of LEDs No. 5, No. 6 and No. 7 lie on one straight line, denoted LINE5; during recognition, the midpoint error for LEDs No. 2, No. 3 and No. 4 must be within 2 pixels;
3. the angle between LINE1 and LINE2 is 75°, between LINE1 and LINE3 45°, and between LINE2 and LINE3 30°; LINE1 makes a 45° angle with each of LINE4 and LINE5, LINE2 a 60° angle with each of LINE4 and LINE5, and LINE3 a 90° angle with each of LINE4 and LINE5; LINE4 and LINE5 are parallel; during recognition, an angle error of less than 3° is considered to satisfy the condition;
4. the predetermined landing position is set on the extension of LINE3, and the distance between the center of the predetermined landing position and the centroid of No. 1 LED is set to ΔL.
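For concreteness, the geometry of embodiment 2 can be written directly as a template record usable by the matching sketch of step 4.1; the distances and angles below are those stated in items 1-3 (distances in multiples of the vertex-lamp radius R), while the pair-of-lamps encoding of each line is an assumed convention.

```python
# Template for the 7-LED mark of embodiment 2 (distances in units of R).
# Collinear groups: LINE1 = 1-2-5, LINE2 = 1-4-7, LINE3 = 1-3-6,
# LINE4 = 2-3-4, LINE5 = 5-6-7 (LINE4 parallel to LINE5).
embodiment2 = {
    'distances': {
        (1, 2): 6.8, (1, 3): 4.8, (1, 4): 5.6,
        (1, 5): 13.6, (1, 6): 9.6, (1, 7): 11.2,
    },
    'angles': {
        ((1, 5), (1, 7)): 75.0,   # LINE1 vs LINE2
        ((1, 5), (1, 6)): 45.0,   # LINE1 vs LINE3
        ((1, 7), (1, 6)): 30.0,   # LINE2 vs LINE3
        ((1, 5), (2, 4)): 45.0,   # LINE1 vs LINE4 (and LINE5)
        ((1, 7), (2, 4)): 60.0,   # LINE2 vs LINE4 (and LINE5)
        ((1, 6), (2, 4)): 90.0,   # LINE3 vs LINE4 (and LINE5)
    },
}
```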
Further, the present invention can be realized by an airborne system:
A UAV assisted landing visual guidance airborne system based on a ground cooperative mark comprises, connected in sequence, an airborne camera, an A/D module, a signal processing and timing control module, a communication module and an image compression module, wherein:
the airborne camera is fixed outside the UAV pod with its optical axis kept pointing downward; preferably the camera optical axis makes an angle of at least 1.5° with the vertical and is tilted toward the front of the UAV;
the A/D module comprises a differential buffer amplifier, an ADC and an opto-isolator connected in sequence, and further comprises a constant current source and a comparator; the comparator starts the constant current source to power the airborne camera;
the signal processing and timing control module comprises a signal processing unit and the FPGA, FLASH and PROM connected to it; the FPGA performs image A/D acquisition and data buffering, and provides read/write timing and logic control for the airborne camera, the ADC and the signal processing unit; the signal processing unit performs image feature extraction, target recognition and prediction-tracking computation; the PROM is the program memory of the FPGA;
the communication module exchanges data with the image compression module and with the flight state controller of the UAV; preferably, LVDS is used for communication between the communication module and the image compression module, and a full-duplex RS422 protocol through optical isolation is used for communication with the flight state controller of the UAV;
the image compression module compresses and stores the acquired images.
The workflow of the UAV assisted landing visual guidance airborne system based on a ground cooperative mark is as follows:
A temperature-measuring diode measures the temperature inside the UAV pod; the constant current source supplies a constant current to the diode, and the comparator compares the voltage across it. When the diode temperature reaches the set range (for example, an operating temperature set to above −40 °C and below 50 °C), the comparator outputs a high level to power the airborne camera. The FPGA provides the master clock, integration signal and channel selection signal for the airborne camera. The analog differential video output by the airborne camera is converted by the A/D module into digital images, which are buffered in the FPGA; the FPGA provides read/write timing and logic control for the airborne camera, the ADC and the signal processing unit. When a frame synchronizes and the FPGA has buffered the set amount of data, the FPGA raises an interrupt to the signal processing unit; when the signal processing unit responds to a frame interrupt, it initializes all data and sets the frame-start flag; when it responds to a data-read interrupt, it reads the data, performs detection, recognition and tracking of the ground cooperative mark, and outputs the predetermined landing position to the flight state controller through the communication module. The FPGA also sends the image data to the image compression module through the communication module.
In the embodiment, the FPGA synthesizes PAL timing, provides the clock for the A/D module, receives the image data converted by the A/D module, implements the RS232 communication protocol, provides read/write timing and logic control for the DSP, and instructs the DSP to read.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto; any transformation or replacement that a person skilled in the art could readily conceive within the technical scope disclosed by the present invention shall be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (17)

1. A UAV assisted landing visual guidance method based on a ground cooperative mark, characterized by comprising the following steps:
Step 1: lay the ground cooperative mark
The ground cooperative mark comprises at least 4 solid circular luminous bodies, each numbered; the diameter of each circular luminous body is at least 300 mm, and the distance between the centroids of any two circular luminous bodies is at least 3 times the radius of a circular luminous body. The circular luminous bodies are fixed on a dark ground background, and the shape they form is directional when seen from the air, i.e., when the shape formed by the circular luminous bodies is rotated as a whole parallel to the ground, the numbering of the circular luminous bodies cannot be confused;
One circular luminous body located on the periphery is taken as the vertex lamp; the predetermined landing position is set in front of the vertex lamp, its distance to each of the other circular luminous bodies is greater than its distance to the vertex lamp, and the distance between the predetermined landing position and the edge of the vertex lamp is greater than the maximum length of the UAV body;
Each predetermined landing position corresponds to a specific ground cooperative mark, which comprises a specific number and arrangement of circular luminous bodies;
A target library is established, in which the following are pre-stored for each ground cooperative mark: the number of circular luminous bodies it comprises, the radius and number of each circular luminous body, and the arrangement relations among the circular luminous bodies, together with the number of the vertex lamp and the positional relation between the predetermined landing position and the vertex lamp;
The airborne camera is mounted on the UAV with its optical axis kept pointing downward;
Step 2: the UAV hovers at a predetermined altitude and assisted landing guidance starts; the confidence is initialized to 0, the target tracking flag is initialized to 0, and the number of potential luminous-body targets and the number of luminous bodies to be recognized are initialized to 0;
Continuously detect and judge whether a potential ground cooperative mark exists on the ground, as follows:
For each frame captured by the airborne camera, detect the bright pixels and merge adjacent bright pixels into highlight regions; take each highlight region as a potential luminous-body target, i.e., consider that it may be one luminous body of a ground cooperative mark;
Step 3: target static recognition:
For the image captured in step 2, extract the static features of each potential luminous-body target in the frame, including its perimeter, radius, area and centroid position, and judge whether it is a circular luminous body: if it is judged to be a circular luminous body, update the number of potential luminous-body targets; otherwise consider the potential target to be interference. Repeat until every potential luminous-body target in the frame has been judged;
Step 4: obtain the predetermined landing position:
Step 4.1: compare the static features of each circular luminous body obtained in step 3 with the ground cooperative mark features pre-stored in the target library established in step 1, to determine whether they are the same target; the target static features here refer to the number of circular luminous bodies and the arrangement relations among the circular luminous bodies. The judgment is as follows:
(1) obtain the number of potential luminous-body targets in the current image from step 3 and compare it with the number of circular luminous bodies of each ground cooperative mark pre-stored in the target library, selecting the mark whose count equals the count obtained in step 3 as the mark to be matched; if no pre-stored mark has an equal count, return to step 2 and detect the ground cooperative mark again;
(2) compare the static features of each luminous body of the mark to be matched with those obtained in step 3 and determine whether they are the same target;
Step 4.2: if they are the same target, obtain the vertex lamp number and position from the information pre-stored in the target library, i.e., obtain the centroid coordinates of the vertex lamp, obtain the number of each circular luminous body, set the target tracking flag to 1 and set the confidence to T; from the centroid coordinates of the vertex lamp obtained in the current frame and the positional relation, pre-stored in the target library, between the vertex lamp and the predetermined landing position corresponding to this ground cooperative mark, obtain the coordinates of the predetermined landing position;
If they are not the same target, return to step 2 and detect the ground cooperative mark again;
Step 5: judge whether the target tracking flag equals 1; if it does, compute the pitch deviation and azimuth deviation between the predetermined landing position and the camera optical axis from the current UAV body state parameters and the predetermined landing position obtained in step 4, and send them to the flight state controller of the UAV, so as to control the UAV to descend toward the predetermined landing position while the airborne camera keeps acquiring images of the ground cooperative mark in real time; the current UAV body state parameters comprise yaw, pitch and roll;
If the target tracking flag does not equal 1, return to step 2 and detect the ground cooperative mark again;
Step 6: perform target prediction and tracking during the UAV descent:
For each frame taken by the airborne camera during the descent, compute all observed values and their corresponding predicted values as follows, and perform a correlation judgment:
Step 6.1: the N continuous two field picture of the airborne camera shooting of flowing water buffer memory, N minimum requirements 3 frame, the area of each circular luminous body of whole Ms the circular luminous bodies of observed value included by ground cooperation mark of every two field picture, average gray and position of form center transverse and longitudinal coordinate, adopt binary linear regression method according to the area of the observed value of this N frame to each circular luminous body in next frame i.e. (N+1) two field picture, average gray and position of form center coordinate are predicted, namely the area of i-th circular luminous body of cooperation mark in ground in (N+1) frame is obtained, average gray, X-direction position of form center, Y-direction position of form center,
Step 6.2: correlativity judges:
From (N+1) frame, correlativity judgement is carried out to the area of each circular luminous body of present frame, average gray, X-direction position of form center, the corresponding predictor of this frame that observed value and the step 6.1 of Y-direction position of form center provide, when each observed value of this frame and corresponding predictor error are all less than predetermined threshold K, then think that this frame meets related condition; Otherwise not think and meet correlated condition;
Step 6.3: when this frame does not meet related condition, will work as previous belief and subtract 1; And by all 4M observed values of this frame, comprise the area of each circular luminous body, average gray, X-direction position of form center, Y-direction position of form center, all replace by the corresponding predictor of this frame, lamp position, summit is obtained for step 6.4, and iteration obtains the predictor of next frame again, described predictor is carried out step 6.1 Regress Forecast according to front N frame observed value and is obtained;
When this frame meets related condition, previous belief will be worked as and be updated to setting value T; Use all 4M observed values of this frame, obtain lamp position, summit for step 6.4, and iteration obtains the predictor of next frame again;
Step 6.4: when confidence level is greater than 0, is set to 1 by current goal tracking mark; Then according to the relation in this ground cooperation mark prestored in the position of form center coordinate of the summit lamp in present frame observed value and object library between summit lamp and scheduled landing position, calculate scheduled landing position; Then work as forebody state parameter and scheduled landing position according to unmanned plane, calculate the pitching of this scheduled landing position and camera optical axis, azimuth deviation; Control unmanned plane tracking scheduled landing position by state of flight controller and carry out descending flight, until unmanned plane drop to scheduled landing position; Unmanned plane is worked as forebody state parameter and is comprised driftage, pitching, rolling parameter;
When confidence level is less than or equal to 0, then think and correctly cannot follow the tracks of ground cooperation mark in lose objects and current flight, be now 0 by target tracking traffic sign placement, state of flight controller controls unmanned plane to be stopped declining, and returns step 2 and again detect ground cooperation mark.
2. The UAV assisted landing visual guiding method based on a ground cooperative mark according to claim 1, characterized in that in step 2 the predetermined altitude interval is 100 m to 150 m, i.e. detection of the ground cooperative mark starts when the UAV is 100 m to 150 m above the ground.
3. The UAV assisted landing visual guiding method based on a ground cooperative mark according to claim 2, characterized in that in step 2, when the predetermined altitude is 100 m to 150 m, the luminous flux of each round luminophor is at least 113000 lm; the airborne camera has a resolution of 720×288, a pixel depth of at least 8 bit, a sensitivity of at most 0.2 lux, an F-number of 1.4, and a field of view of 9 to 20 degrees.
4. The UAV assisted landing visual guiding method based on a ground cooperative mark according to claim 1, characterized in that the method of merging adjacent high-luminance pixels into highlight regions in step 2 is: first merge the adjacent high-luminance pixels within each image row into an object, and then merge the objects that overlap across rows (see the sketch below).
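For illustration, a minimal sketch of this row-then-column merge, assuming a plain grayscale image and an illustrative brightness threshold (the patent does not fix one here); a small union-find merges runs that touch across neighbouring rows:

```python
import numpy as np

def highlight_regions(img, thresh=200):
    """img: 2-D uint8 array. Returns a dict region_id -> list of
    (row, col_start, col_end) runs of high-luminance pixels."""
    parent = {}
    def find(a):                      # union-find with path halving
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    def union(a, b):
        parent[find(a)] = find(b)

    next_id, prev_runs, all_runs = 0, [], []
    for r in range(img.shape[0]):
        bright = np.flatnonzero(img[r] >= thresh)
        # split the bright columns of this row into consecutive runs
        runs = []
        for grp in np.split(bright, np.where(np.diff(bright) > 1)[0] + 1):
            if grp.size:
                runs.append((int(grp[0]), int(grp[-1])))
        cur = []
        for s, e in runs:
            rid = None
            for ps, pe, pid in prev_runs:   # overlap with a previous-row run?
                if ps <= e and pe >= s:
                    if rid is None:
                        rid = pid
                    union(pid, rid)         # merge touching regions
            if rid is None:                 # start a brand-new region
                rid = next_id
                parent[rid] = rid
                next_id += 1
            cur.append((s, e, rid))
            all_runs.append((r, s, e, rid))
        prev_runs = cur
    out = {}
    for r, s, e, rid in all_runs:
        out.setdefault(find(rid), []).append((r, s, e))
    return out
```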
5. The UAV assisted landing visual guiding method based on a ground cooperative mark according to claim 1, characterized in that each solid round luminophor is formed by coiling an LED strip.
6. The UAV assisted landing visual guiding method based on a ground cooperative mark according to claim 1, characterized in that the ground cooperative mark comprises 5 round luminophors, of which 4 are arranged in a square and 1 lies outside the square and serves as the peak lamp; the distance between the centre of the preset landing position and the centroid of the peak lamp is set to ΔL.
7. The UAV assisted landing visual guiding method based on a ground cooperative mark according to claim 1, characterized in that the ground cooperative mark comprises 7 round luminophors, each being a solid LED disc of equal size, where the No. 1 LED is the peak lamp and its radius is R;
(1) the distance between the No. 2 LED and the No. 1 LED is L1-2 = 6.8R, between the No. 3 LED and the No. 1 LED it is L1-3 = 4.8R, between the No. 4 LED and the No. 1 LED it is L1-4 = 5.6R, between the No. 5 LED and the No. 1 LED it is L1-5 = 13.6R, between the No. 6 LED and the No. 1 LED it is L1-6 = 9.6R, and between the No. 7 LED and the No. 1 LED it is L1-7 = 11.2R; all distances are measured between the LED centroids;
(2) the centroids of the No. 1, No. 2 and No. 5 LEDs are collinear, and this line is called LINE1; the centroids of the No. 1, No. 3 and No. 6 LEDs are collinear, and this line is called LINE3; the centroids of the No. 1, No. 4 and No. 7 LEDs are collinear, and this line is called LINE2; the centroids of the No. 2, No. 3 and No. 4 LEDs are collinear, and this line is called LINE4; the centroids of the No. 5, No. 6 and No. 7 LEDs are collinear, and this line is called LINE5;
(3) the angle between LINE1 and LINE2 is 75°, between LINE1 and LINE3 it is 45°, and between LINE2 and LINE3 it is 30°; LINE1 makes an angle of 45° with each of LINE4 and LINE5, LINE2 makes an angle of 60° with each of LINE4 and LINE5, and LINE3 makes an angle of 90° with each of LINE4 and LINE5; LINE4 and LINE5 are parallel;
(4) the preset landing position lies on the extension of LINE3, and the distance between the centre of the preset landing position and the centroid of the No. 1 LED is set to ΔL.
8. The UAV assisted landing visual guiding method based on a ground cooperative mark according to any one of claims 1-7, characterized in that the method by which step 3 judges whether a potential luminophor target is a round luminophor is as follows:
Step 3.1: obtain the perimeter Li of the i-th potential luminophor target; the radius of this luminophor is then ri = Li/(2π), the perimeter being the number of pixels on the edge of this luminophor in this frame;
Step 3.2: compute the centroid (x0, y0) of this luminophor and construct the standard circle equation (x − x0)² + (y − y0)² = ri² with the centroid as the circle centre; the coordinate origin is at the top-left corner of the image, the x axis points right and the y axis points down;
Step 3.3: let the actual edge coordinates of the luminophor be (xt, yt); compute the mean square error Ex = (1/n)Σt(xt − x̂t)² between the edge abscissas of the luminophor and the abscissas x̂t of the corresponding points on the standard circle formed about the centroid, and the mean square error Ey = (1/n)Σt(yt − ŷt)² between the edge ordinates and the corresponding standard-circle ordinates ŷt, n being the number of edge points;
Compute the maximum error Emax_x = maxt|xt − x̂t| of the abscissas against the standard circle, and the maximum error Emax_y = maxt|yt − ŷt| of the ordinates;
If Ex and Ey are both less than 1, and Emax_x and Emax_y are both less than or equal to 1, judge this luminophor to be a round luminophor and update the number of potential luminophor targets; otherwise judge it to be interference; repeat until every potential luminophor target in this frame has been judged. (A sketch of this test follows.)
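For illustration, a minimal sketch of this circularity test. The original equations appear only as images in the source; here the standard-circle point paired with each edge pixel is taken at the same polar angle about the centroid, which is one plausible reading and should be treated as an assumption:

```python
import numpy as np

def is_round_luminophor(edge_pts, region_pts):
    """edge_pts: (n, 2) array of (x, y) edge pixels of one candidate;
    region_pts: (m, 2) array of all pixels covered by the candidate."""
    L = len(edge_pts)                     # perimeter = edge pixel count
    r = L / (2 * np.pi)                   # radius from the perimeter
    cx, cy = region_pts.mean(axis=0)      # centroid, as in claim 11
    dx = edge_pts[:, 0] - cx
    dy = edge_pts[:, 1] - cy
    theta = np.arctan2(dy, dx)            # polar angle of each edge pixel
    xh = cx + r * np.cos(theta)           # standard-circle abscissas
    yh = cy + r * np.sin(theta)           # standard-circle ordinates
    Ex = np.mean((edge_pts[:, 0] - xh) ** 2)
    Ey = np.mean((edge_pts[:, 1] - yh) ** 2)
    Emax_x = np.max(np.abs(edge_pts[:, 0] - xh))
    Emax_y = np.max(np.abs(edge_pts[:, 1] - yh))
    return Ex < 1 and Ey < 1 and Emax_x <= 1 and Emax_y <= 1
```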
9. The UAV assisted landing visual guiding method based on a ground cooperative mark according to any one of claims 1-7, characterized in that in step 6.1 the area, average gray level and centroid coordinates of each round luminophor in the next, i.e. (N+1)-th, frame are predicted from the observed values of the N frames by bivariate linear regression as follows:
Let the observed area, average gray level and centroid coordinates of the i-th round luminophor in the j-th frame be Sij, Gij, Xij and Yij respectively, where the average gray level is Gij = (1/Sij)Σg(x, y), g(x, y) being the gray value of each pixel within the area Sij of the i-th round luminophor in the j-th frame; Xij and Yij are the x and y coordinates of the centroid of the i-th round luminophor in the j-th frame;
Let the coefficients of Sij be a_si, b_si, c_si; of Gij be a_gi, b_gi, c_gi; of Xij be a_xi, b_xi, c_xi; and of Yij be a_yi, b_yi, c_yi. Let the frame-number sequence be x1 = {1, 2, ..., N} and the squared frame-number sequence be x2 = {1, 4, ..., N²}. The coefficients a_si, b_si, c_si, a_gi, b_gi, c_gi, a_xi, b_xi, c_xi, a_yi, b_yi, c_yi are obtained by elimination of unknowns from the four groups of least-squares equations of the regression models of the form Sij = a_si + b_si·x1(j) + c_si·x2(j) (and likewise for Gij, Xij and Yij) fitted over j = 1, ..., N;
The area, average gray level, centroid x coordinate and centroid y coordinate of the i-th round luminophor of the ground cooperative mark in frame (N+1) are then obtained from the regression prediction model as Ŝi = a_si + b_si(N+1) + c_si(N+1)², and correspondingly Ĝi, X̂i and Ŷi from their own coefficients. (A sketch of this fit follows.)
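For illustration, a minimal sketch of this prediction for one quantity of one luminophor: regressing on the frame number and its square is the same least-squares problem that numpy.polyfit solves with degree 2. The frame values below are made-up numbers; all names are illustrative:

```python
import numpy as np

def predict_next(obs):
    """obs: length-N array of one observed quantity (area, average gray,
    centroid x or centroid y) over frames 1..N.
    Returns the regression prediction for frame N + 1."""
    N = len(obs)
    j = np.arange(1, N + 1)
    c2, c1, c0 = np.polyfit(j, obs, 2)   # fits obs ≈ c0 + c1*j + c2*j**2
    return c0 + c1 * (N + 1) + c2 * (N + 1) ** 2

# e.g. areas of one luminophor over N = 6 frames (claim 10 suggests N >= 6)
areas = np.array([412.0, 436.0, 459.0, 485.0, 510.0, 538.0])
print(predict_next(areas))               # predicted area in frame 7
```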
10. The UAV assisted landing visual guiding method based on a ground cooperative mark according to claim 1, characterized in that in step 6.1 N is at least 6 frames.
11. The UAV assisted landing visual guiding method based on a ground cooperative mark according to claim 8, characterized in that in step 3.2 the centroid of each luminophor is calculated as x0 = (1/Si)Σx and y0 = (1/Si)Σy, the sums running over all pixels (x, y) in the coverage region Qi of this luminophor, where the area Si of this luminophor is determined by the size of its coverage region Qi. (A sketch of this computation follows.)
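For illustration, a minimal sketch of this centroid formula: the mean of the pixel coordinates over the coverage region Qi, whose pixel count gives the area Si. Names are illustrative:

```python
import numpy as np

def centroid(region_pts):
    """region_pts: (S_i, 2) array of the (x, y) pixels in region Q_i."""
    S_i = len(region_pts)                 # area = number of covered pixels
    x0 = region_pts[:, 0].sum() / S_i     # mean abscissa
    y0 = region_pts[:, 1].sum() / S_i     # mean ordinate
    return x0, y0
```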
12. The UAV assisted landing visual guiding method based on a ground cooperative mark according to any one of claims 1-7, characterized in that in step 4.1(2) the following parameters are computed and compared in order to judge whether the target to be matched and the luminophors obtained in step 3 are the same target (see the sketch after this list):
(1) the distances between the centroids of the round luminophors;
(2) the number of luminophors on the lines between the centroids of the round luminophors;
(3) the angles of the lines between the centroids of the round luminophors.
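For illustration, a minimal sketch of these matching parameters: pairwise centroid distances and the angles of the connecting lines (the luminophor counts along each line would be compared the same way). It assumes the observed luminophors are already ordered like the stored template, and the scale/rotation normalisation and tolerance are assumptions, not from the patent:

```python
import numpy as np

def match_parameters(centroids):
    """centroids: (M, 2) array of round-luminophor centroids.
    Returns the M x M pairwise distance and line-angle matrices."""
    d = centroids[:, None, :] - centroids[None, :, :]
    return np.hypot(d[..., 0], d[..., 1]), np.arctan2(d[..., 1], d[..., 0])

def same_target(obs_c, template_c, tol=0.1):
    """Compare distances up to an overall scale and line angles up to an
    overall rotation, as a scale/rotation-invariant shape test."""
    d1, a1 = match_parameters(obs_c)
    d2, a2 = match_parameters(template_c)
    iu = np.triu_indices(len(obs_c), k=1)       # each pair counted once
    scale = d1[iu].sum() / d2[iu].sum()         # remove overall image scale
    rot = a1[iu][0] - a2[iu][0]                 # remove overall rotation
    dist_ok = np.allclose(d1[iu], scale * d2[iu], rtol=tol)
    ang_ok = np.allclose(np.cos(a1[iu] - a2[iu] - rot), 1.0, atol=tol)
    return dist_ok and ang_ok
```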
13. The UAV assisted landing visual guiding method based on a ground cooperative mark according to any one of claims 1-7, characterized in that in step 6.2 the error between an observed value and its corresponding predicted value is calculated as (observed value − predicted value)/predicted value.
14. The UAV assisted landing visual guiding method based on a ground cooperative mark according to claim 1, characterized in that each round luminophor is mounted on dark nylon cloth.
15. The UAV assisted landing visual guiding method based on a ground cooperative mark according to claim 1, characterized in that the predetermined threshold in step 6.2 is K = 20%.
16. A UAV assisted landing visual guiding airborne system based on a ground cooperative mark, characterized in that it comprises, connected in sequence, an airborne camera, an A/D module, a signal processing and timing control module, and a communication module; wherein:
the airborne camera is fixed outside the UAV pod, with its optical axis kept pointing straight down;
the A/D module comprises a differential buffer amplifier, an ADC and an opto-isolator connected in sequence, and further comprises a constant-current source and a comparator; the comparator starts the constant-current source to power the airborne camera;
the signal processing and timing control module comprises a signal processing unit, an FPGA, a FLASH, a PROM and an image compression module; the signal processing unit is connected to the FPGA and the FLASH respectively, and the FPGA is also connected to the image compression module; the FPGA performs image A/D acquisition and data buffering, and provides the read/write timing and logic control for the airborne camera, the ADC and the signal processing unit; the signal processing unit performs the image feature extraction, target recognition and predictive tracking computations; the PROM is the FPGA program store; the image compression module compresses and stores the images acquired by the FPGA;
the communication module transmits the data sent by the image compression module and exchanges data with the UAV's flight state controller.
17. The UAV assisted landing visual guiding airborne system based on a ground cooperative mark according to claim 16, characterized in that LVDS is used for communication between the communication module and the image compression module, and communication with the UAV's flight state controller uses the full-duplex RS-422 protocol through opto-isolation.
CN201510496861.8A 2015-08-13 2015-08-13 UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark Pending CN105000194A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510496861.8A CN105000194A (en) 2015-08-13 2015-08-13 UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark

Publications (1)

Publication Number Publication Date
CN105000194A true CN105000194A (en) 2015-10-28

Family

ID=54373127

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510496861.8A Pending CN105000194A (en) 2015-08-13 2015-08-13 UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark

Country Status (1)

Country Link
CN (1) CN105000194A (en)

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107065894B (en) * 2016-01-28 2021-11-26 松下电器(美国)知识产权公司 Unmanned aerial vehicle, flying height control device, method, and computer-readable recording medium
CN107065894A (en) * 2016-01-28 2017-08-18 松下电器(美国)知识产权公司 Unmanned vehicle, flight altitude control device, method and program
CN105857582A (en) * 2016-04-06 2016-08-17 北京博瑞爱飞科技发展有限公司 Method and device for adjusting shooting angle, and unmanned air vehicle
CN106500699A (en) * 2016-05-25 2017-03-15 上海铸天智能科技有限公司 A kind of position and orientation estimation method suitable for Autonomous landing in unmanned plane room
CN106500699B (en) * 2016-05-25 2019-06-18 上海铸天智能科技有限公司 A kind of position and orientation estimation method suitable for Autonomous landing in unmanned plane room
CN106598075A (en) * 2016-07-21 2017-04-26 深圳曼塔智能科技有限公司 System and method for tracking control of unmanned aerial vehicle based on luminescence object identification
AU2017306320B2 (en) * 2016-08-04 2020-07-02 Beijing Jingdong Qianshi Technology Co., Ltd. Method, device and system for guiding unmanned aerial vehicle to land
US11383857B2 (en) 2016-08-04 2022-07-12 Beijing Jingdong Qianshi Technology Co., Ltd. Method, device and system for guiding unmanned aerial vehicle to land
WO2018024069A1 (en) * 2016-08-04 2018-02-08 北京京东尚科信息技术有限公司 Method, device and system for guiding unmanned aerial vehicle to land
CN106184758A (en) * 2016-09-18 2016-12-07 成都天麒科技有限公司 The automatic medicament feeding system of a kind of plant protection unmanned plane and method
CN106184758B (en) * 2016-09-18 2018-07-20 成都天麒科技有限公司 A kind of automatic medicament feeding system and method for plant protection drone
CN106502257B (en) * 2016-10-25 2020-06-02 南京奇蛙智能科技有限公司 Anti-interference control method for precise landing of unmanned aerial vehicle
CN106502257A (en) * 2016-10-25 2017-03-15 南京奇蛙智能科技有限公司 A kind of unmanned plane precisely lands jamproof control method
CN106516145A (en) * 2016-12-16 2017-03-22 武汉理工大学 Rotor craft safe capturing device and method
CN106516145B (en) * 2016-12-16 2018-08-07 武汉理工大学 Rotor craft safely captures device and catching method
CN106791716A (en) * 2017-02-27 2017-05-31 四川豪特实业集团有限公司 Mobile robot's image capturing system
CN106628211B (en) * 2017-03-16 2019-02-26 山东大学 Ground control formula unmanned plane during flying landing system and method based on LED dot matrix
CN106628211A (en) * 2017-03-16 2017-05-10 山东大学 Ground guiding type unmanned aerial vehicle flying landing system and method based on LED dot matrix
CN108694728A (en) * 2017-04-11 2018-10-23 北京乐普盛通信息技术有限公司 Unmanned plane guides landing method, apparatus and system
WO2018227350A1 (en) * 2017-06-12 2018-12-20 深圳市大疆创新科技有限公司 Control method for homeward voyage of unmanned aerial vehicle, unmanned aerial vehicle and machine-readable storage medium
CN107515622A (en) * 2017-07-27 2017-12-26 南京航空航天大学 A kind of rotor wing unmanned aerial vehicle autonomous control method of drop in mobile target
CN107451788A (en) * 2017-09-09 2017-12-08 厦门大壮深飞科技有限公司 Automatic delivering method and delivery station are concentrated in unmanned plane logistics based on independent navigation
CN107985556A (en) * 2017-11-29 2018-05-04 天津聚飞创新科技有限公司 Unmanned plane hovering system and method
CN109032166A (en) * 2018-03-08 2018-12-18 李绪臣 Track the method for driving vehicle immediately based on unmanned plane
US11338920B2 (en) 2018-05-09 2022-05-24 Beijing Whyhow Information Technology Co., Ltd. Method for guiding autonomously movable machine by means of optical communication device
TWI746973B (en) * 2018-05-09 2021-11-21 大陸商北京外號信息技術有限公司 Method for guiding a machine capable of autonomous movement through optical communication device
CN109521799A (en) * 2018-09-28 2019-03-26 易瓦特科技股份公司 The flight control method and device identified based on land marking
CN109521787A (en) * 2018-09-28 2019-03-26 易瓦特科技股份公司 Method and device for aircraft mark landing
CN109521789A (en) * 2018-09-28 2019-03-26 易瓦特科技股份公司 Identification method and device
CN109460046B (en) * 2018-10-17 2021-08-06 吉林大学 Unmanned aerial vehicle natural landmark identification and autonomous landing method
CN109460046A (en) * 2018-10-17 2019-03-12 吉林大学 A kind of unmanned plane identify naturally not with independent landing method
CN109445432A (en) * 2018-10-31 2019-03-08 中国科学技术大学 Unmanned plane and ground mobile robot formation localization method based on image
CN109407700A (en) * 2018-11-05 2019-03-01 周口师范学院 A kind of unmanned plane independent landing method and its implementing device guided using image
CN111325752A (en) * 2018-12-17 2020-06-23 北京华航无线电测量研究所 Visual auxiliary method for accurate landing and dynamic pose adjustment of helicopter
CN111325752B (en) * 2018-12-17 2023-06-13 北京华航无线电测量研究所 Helicopter accurate landing and dynamic pose adjustment vision auxiliary method
CN109911237A (en) * 2019-04-02 2019-06-21 赵嘉睿 Based on ultraviolet light to the unmanned machine aided drop and guidance system and application of empty coded beacons
CN109911237B (en) * 2019-04-02 2022-03-22 赵嘉睿 Unmanned aerial vehicle landing assisting and guiding system based on ultraviolet air coding beacon and application
CN110861779A (en) * 2019-12-02 2020-03-06 中电科特种飞机系统工程有限公司 Carrier landing system and method for vertical take-off and landing unmanned aerial vehicle
WO2022040942A1 (en) * 2020-08-25 2022-03-03 深圳市大疆创新科技有限公司 Flight positioning method, unmanned aerial vehicle and storage medium
CN112455705A (en) * 2020-12-04 2021-03-09 南京晓飞智能科技有限公司 Unmanned aerial vehicle autonomous accurate landing system and method
CN112455705B (en) * 2020-12-04 2024-05-03 南京晓飞智能科技有限公司 Unmanned aerial vehicle autonomous accurate landing system and method
CN112651079A (en) * 2020-12-17 2021-04-13 陕西宝成航空仪表有限责任公司 Angle simulation resolving device for electromechanical airborne product
CN113066050A (en) * 2021-03-10 2021-07-02 天津理工大学 Method for resolving course attitude of airdrop cargo bed based on vision
CN114596491A (en) * 2022-03-03 2022-06-07 北京新科汇智科技发展有限公司 Unmanned aerial vehicle induction method and system

Similar Documents

Publication Publication Date Title
CN105000194A (en) UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark
US11287835B2 (en) Geo-fiducials for UAV navigation
CN109885086B (en) Unmanned aerial vehicle vertical landing method based on composite polygonal mark guidance
CN107402396A (en) UAV Landing guiding system and method based on multimode navigation
CN109923492A (en) Flight path determines
CN109901580A (en) A kind of unmanned plane cooperates with unmanned ground robot follows diameter obstacle avoidance system and its method
CN105759829A (en) Laser radar-based mini-sized unmanned plane control method and system
CN105492985A (en) Multi-sensor environment map building
CN112130579A (en) Tunnel unmanned aerial vehicle inspection method and system
CN107850902A (en) Camera configuration in loose impediment
CN104808685A (en) Vision auxiliary device and method for automatic landing of unmanned aerial vehicle
CN104298248A (en) Accurate visual positioning and orienting method for rotor wing unmanned aerial vehicle
CN109460046B (en) Unmanned aerial vehicle natural landmark identification and autonomous landing method
CN110879617A (en) Infrared-guided unmanned aerial vehicle landing method and device
CN111413708A (en) Unmanned aerial vehicle autonomous landing site selection method based on laser radar
CN113050685B (en) Autonomous inspection method for underground unmanned aerial vehicle of coal mine
WO2019040179A1 (en) Controlling landings of an aerial robotic vehicle using three-dimensional terrain maps generated using visual-inertial odometry
CN111273679A (en) Visual-guided network-collision recovery longitudinal guidance method for small fixed-wing unmanned aerial vehicle
CN113156998A (en) Unmanned aerial vehicle flight control system and control method
Niu et al. SuperDock: A deep learning-based automated floating trash monitoring system
CN114867988A (en) Map comprising data for determining a route for an aircraft during a GNSS fault
CN110104167A (en) A kind of automation search and rescue UAV system and control method using infrared thermal imaging sensor
CN103697883A (en) Aircraft horizontal attitude determination method based on skyline imaging
Kamat et al. A survey on autonomous navigation techniques
CN207408598U (en) UAV Landing guiding system based on multimode navigation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20151028)