CN205028160U - Measurement-and-solving device and control device for autonomous landing of an unmanned aerial vehicle - Google Patents

Measurement-and-solving device and control device for autonomous landing of an unmanned aerial vehicle

Info

Publication number
CN205028160U
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
landing area
relative position
landing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn - After Issue
Application number
CN201520652253.7U
Other languages
Chinese (zh)
Inventor
石一磊
王森林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Quanzhou Institute of Equipment Manufacturing
Original Assignee
Quanzhou Institute of Equipment Manufacturing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Quanzhou Institute of Equipment Manufacturing filed Critical Quanzhou Institute of Equipment Manufacturing
Priority to CN201520652253.7U priority Critical patent/CN205028160U/en
Application granted granted Critical
Publication of CN205028160U publication Critical patent/CN205028160U/en
Withdrawn - After Issue legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The utility model provides a measurement-and-solving device and a control device for the autonomous landing of an unmanned aerial vehicle (UAV). The measurement-and-solving device comprises: a first acquiring unit for obtaining stereo image information of the area to be landed in; a second acquiring unit for obtaining relative position information from that stereo image information, where the relative position information is the relative position between the UAV and candidate landing points within the area to be landed in; and a determining unit for determining the UAV's landing point within the area according to the relative position information. With the utility model, the landing control accuracy of the UAV is improved.

Description

Measurement-and-solving device and control device for autonomous UAV landing
Technical field
The utility model relates to the field of unmanned aerial vehicles (UAVs), and in particular to a measurement-and-solving device and a control device for autonomous UAV landing.
Background technology
Unmanned aerial vehicle (UAV) technology is now widely applied, and landing is one of the important problems in UAV technology. In the prior art, a landing UAV relies on an on-board image acquisition device to obtain an image of the ground, and landing is then controlled on the basis of that image. However, the image that a prior-art UAV relies on when landing is two-dimensional, so the resulting landing control accuracy is low.
For the problem in the related art that the control accuracy is low when UAV technology is used for landing control, no effective solution has yet been proposed.
Utility model content
The main purpose of the utility model is to provide a measurement-and-solving method and device, and a control method and device, for autonomous UAV landing, so as to solve the problem in the related art of low control accuracy when UAV technology is used for landing control.
To achieve these goals, according to one aspect of the utility model, a measurement-and-solving device for autonomous UAV landing is provided. The device comprises: a first acquiring unit for obtaining stereo image information of the area to be landed in; a second acquiring unit for obtaining relative position information from the stereo image information of the area to be landed in, where the relative position information is the relative position between the UAV and the candidate landing points within the area; and a determining unit for determining the UAV's landing point within the area according to the relative position information.
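A minimal structural sketch of the three units named above follows. All names here are illustrative placeholders (this is not code from the patent): an image acquirer, a relative-position solver, and a landing-point decider.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class RelativePosition:
    offsets: np.ndarray  # (n, 3) offsets from the UAV body to each candidate landing point

class LandingMeasurementSolver:
    def acquire_stereo_image(self):
        """First acquiring unit: capture a 3D/depth image of the area to be landed in."""
        raise NotImplementedError

    def solve_relative_position(self, stereo_image) -> RelativePosition:
        """Second acquiring unit: derive candidate-point offsets relative to the UAV."""
        raise NotImplementedError

    def determine_landing_point(self, rel: RelativePosition) -> int:
        """Determining unit: return the index of the chosen candidate landing point."""
        raise NotImplementedError
```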
Further, the second acquiring unit comprises: a first acquiring module for obtaining the coordinates of a plurality of candidate landing points within the area to be landed in, from the stereo image information of the area and a real-world coordinate system; a second acquiring module for obtaining the body coordinates of the UAV from the UAV's body coordinate system; and a third acquiring module for obtaining the relative position information from the coordinates of the candidate landing points and the body coordinates of the UAV.
Further, the device also comprises a third acquiring unit for obtaining relative attitude information from the stereo image information of the area to be landed in before the landing point is determined from the relative position information; in that case, the determining unit determines the UAV's landing point within the area according to both the relative position information and the relative attitude information.
Further, the second acquiring unit calculates the relative position according to a relative position algorithm, using the real-world coordinates of the candidate landing points and the translation and rotation from the real-world coordinate system to the body coordinate system. The third acquiring unit calculates the relative attitude according to a relative attitude algorithm, using the rotation from the real-world coordinate system to the body coordinate system. The determining unit selects the UAV's landing point from the plurality of candidate landing points according to the calculated relative position and relative attitude.
Further, the device also comprises a recognition unit for identifying an obstacle in the area to be landed in from the stereo image information before the landing point is determined from the relative position information. The determining unit comprises: an acquiring module for obtaining the size of the obstacle; a judging module for judging whether the size of the obstacle is greater than a preset size, where the preset size is greater than the size of the UAV's body; and a determining module which, when the obstacle is judged to be larger than the preset size, takes the obstacle as the UAV's landing point according to the relative position information, and otherwise, when the obstacle is judged to be smaller than or equal to the preset size, selects another landing point in the area away from the obstacle as the UAV's landing point according to the relative position information.
The method adopted by the above measurement-and-solving device comprises: obtaining stereo image information of the area to be landed in; obtaining relative position information from that stereo image information, where the relative position information is the relative position between the UAV and the candidate landing points within the area; and determining the UAV's landing point within the area according to the relative position information.
Obtaining the relative position information from the stereo image information of the area to be landed in comprises: obtaining the coordinates of a plurality of candidate landing points within the area from the stereo image information and a real-world coordinate system; obtaining the UAV's body coordinates from the UAV's body coordinate system; and obtaining the relative position information from the coordinates of the candidate landing points and the body coordinates of the UAV.
Before the landing point is determined from the relative position information, the method may further comprise obtaining relative attitude information from the stereo image information of the area to be landed in; in that case, determining the UAV's landing point from the relative position information comprises determining the landing point within the area from both the relative position information and the relative attitude information.
Obtaining the relative position information from the stereo image information comprises calculating the relative position according to a relative position algorithm, using the real-world coordinates of the candidate landing points and the translation and rotation from the real-world coordinate system to the body coordinate system. Obtaining the relative attitude information comprises calculating the relative attitude according to a relative attitude algorithm, using the rotation from the real-world coordinate system to the body coordinate system. Determining the landing point from the relative position and relative attitude information comprises selecting the UAV's landing point from the plurality of candidate landing points according to the calculated relative position and relative attitude.
Before the landing point is determined from the relative position information, the method may further comprise identifying an obstacle in the area to be landed in from the stereo image information. In that case, determining the UAV's landing point from the relative position information comprises: obtaining the size of the obstacle; judging whether the size of the obstacle is greater than a preset size, where the preset size is greater than the size of the UAV's body; if the obstacle is judged to be larger than the preset size, taking the obstacle as the UAV's landing point according to the relative position information; and if the obstacle is smaller than or equal to the preset size, selecting another landing point in the area away from the obstacle as the UAV's landing point according to the relative position information.
To achieve these goals, according to another aspect of the utility model, a control device for autonomous UAV landing is provided. The device comprises: a measurement-and-solving unit for determining the UAV's landing point within the area to be landed in, where the landing point is determined according to the measurement-and-solving method provided by the utility model; and a control unit for controlling the UAV to land at the determined landing point.
The corresponding control method for autonomous UAV landing comprises: determining the UAV's landing point within the area to be landed in, where the landing point is determined according to the measurement-and-solving method provided by the utility model; and controlling the UAV to land at the determined landing point.
With the utility model, the acquired stereo image information of the area to be landed in is used to determine the UAV's landing point within that area. This solves the problem in the related art of low control accuracy when UAV technology is used for landing control, and thereby improves the landing control accuracy of the UAV.
Accompanying drawing explanation
The accompanying drawings, which form a part of this application, are provided for further understanding of the utility model; the schematic drawings and their description explain the utility model and do not unduly limit it. In the drawings:
Fig. 1 is a flowchart of the measurement-and-solving method for autonomous UAV landing according to an embodiment of the utility model;
Fig. 2 is a schematic diagram of the test field for the measurement-and-solving method for autonomous UAV landing according to an embodiment of the utility model;
Fig. 3 is a flowchart of the measurement-and-solving method for autonomous UAV landing according to a preferred embodiment of the utility model;
Fig. 4 is a flowchart of the control method for autonomous UAV landing according to an embodiment of the utility model;
Fig. 5 is a schematic diagram of the measurement-and-solving device for autonomous UAV landing according to an embodiment of the utility model; and
Fig. 6 is a schematic diagram of the control device for autonomous UAV landing according to an embodiment of the utility model.
Detailed description of the embodiments
It should be noted that, provided they do not conflict, the embodiments in this application and the features of the embodiments may be combined with one another. The utility model is described in detail below with reference to the drawings and in conjunction with the embodiments.
To enable those skilled in the art to better understand the scheme of this application, the technical scheme in the embodiments of this application is described clearly and completely below in conjunction with the accompanying drawings.
Fig. 1 is a flowchart of the measurement-and-solving method for autonomous UAV landing according to an embodiment of the utility model. As shown in Fig. 1, the method comprises the following steps:
Step S11: obtain stereo image information of the area to be landed in.
The UAV carries an imaging device, for example a camera. The imaging device is a 3D imaging device used to capture a 3D image of the area to be landed in, from which the stereo image information of the area is obtained.
Fig. 2 is a schematic diagram of the test field for the measurement-and-solving method for autonomous UAV landing according to an embodiment of the utility model. As shown in Fig. 2, this embodiment involves a camera 1, a UAV 2, an obstacle 3 and a site 4, where the site 4 is the scene of the area to be landed in. Preferably, a pinhole camera 31 is adopted as the camera 1.
The camera 1 and the UAV 2 may be rigidly fixed to each other. When the camera 1 is installed, its principal optical axis is perpendicular to the UAV body and points downward, i.e. it images the area directly below the UAV body. In this way, the problem of solving the relative position between the UAV 2 and the area near the landing point is converted into the problem of solving the relation between the camera coordinate system and the feature-image coordinate system.
The UAV 2 may be any aircraft capable of carrying a chip, such as a small unmanned helicopter or a quadrotor, hexarotor or octorotor. The UAV 2 is equipped with the camera 1 and carries a chip with computing capability; the 3D detection algorithm is integrated in this chip, which is used to solve the 3D images captured by the camera 1.
The obstacle 3 and the site 4 constitute the environment near the intended landing point of the UAV 2. Preferably, the algorithm of the UAV 2 selects an obstacle larger than the UAV 2 to land on, or lands at another feasible location. When the obstacle is too small, another landing point is selected for landing according to the selection algorithm.
Step S12: obtain relative position information from the stereo image information of the area to be landed in.
The relative position information is the relative position between the UAV and the candidate landing points within the area to be landed in.
To obtain the relative position information more accurately, separate coordinate systems may be set up for the UAV and for the area to be landed in, namely the UAV's body coordinates and a real-world coordinate system for locating the candidate landing points. Preferably, the relative position information is obtained from the stereo image information of the area in the following three steps (a code sketch follows the steps below):
Obtain the coordinates of a plurality of candidate landing points within the area from the stereo image information of the area and the real-world coordinate system.
Obtain the UAV's body coordinates from the UAV's body coordinate system.
Obtain the relative position information from the coordinates of the candidate landing points and the body coordinates of the UAV.
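A minimal sketch of these three steps, under assumed conventions (a world frame "w" and a body frame "b"); the function and variable names are illustrative, not the patent's:

```python
import numpy as np

def relative_positions(points_w: np.ndarray, uav_pos_w: np.ndarray, R_bw: np.ndarray) -> np.ndarray:
    """Express candidate landing points in the UAV body frame.

    points_w  : (n, 3) candidate landing points in the real-world frame
    uav_pos_w : (3,)   UAV position in the real-world frame
    R_bw      : (3, 3) rotation mapping world-frame vectors into the body frame
    """
    offsets_w = points_w - uav_pos_w     # subtract the UAV position (world-frame offsets)
    return (R_bw @ offsets_w.T).T        # the same offsets expressed in the body frame
```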
Step S13: determine the UAV's landing point within the area to be landed in according to the relative position information.
Preferably, before the landing point is determined from the relative position information, the method also comprises obtaining relative attitude information from the stereo image information of the area. In that case, determining the UAV's landing point from the relative position information comprises determining the landing point within the area from both the relative position information and the relative attitude information.
Preferably, obtaining the relative position information from the stereo image information comprises calculating the relative position according to a relative position algorithm, using the real-world coordinates of the candidate landing points and the translation and rotation from the real-world coordinate system to the body coordinate system. Obtaining the relative attitude information comprises calculating the relative attitude according to a relative attitude algorithm, using the rotation from the real-world coordinate system to the body coordinate system. Determining the landing point from the relative position and relative attitude information comprises selecting the UAV's landing point from the plurality of candidate landing points according to the calculated relative position and relative attitude.
Preferably, before the landing point is determined from the relative position information, the method also comprises identifying an obstacle in the area from its stereo image information. In that case, determining the UAV's landing point from the relative position information comprises: obtaining the size of the obstacle; judging whether the size of the obstacle is greater than a preset size, where the preset size is greater than the size of the UAV's body; if the obstacle is larger than the preset size, taking the obstacle as the UAV's landing point according to the relative position information; and if the obstacle is smaller than or equal to the preset size, selecting another landing point in the area away from the obstacle as the UAV's landing point according to the relative position information.
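The obstacle rule just described can be sketched as follows; the attribute names (size, top_point, contains, height_spread) are placeholders for whatever representation the solver actually produces:

```python
def choose_landing_point(candidates, obstacle, preset_size):
    """Land on the obstacle only if it is larger than the preset size
    (which itself exceeds the UAV body); otherwise pick a clear candidate."""
    if obstacle is not None and obstacle.size > preset_size:
        return obstacle.top_point                      # e.g. a platform big enough to land on
    clear = [p for p in candidates
             if obstacle is None or not obstacle.contains(p)]
    return min(clear, key=lambda p: p.height_spread)   # flattest of the remaining points
```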
Fig. 3 is a flowchart of the measurement-and-solving method for autonomous UAV landing according to a preferred embodiment of the utility model. The embodiment shown in this figure presents an optical 3D measurement-and-solving method for autonomous UAV landing. The optical 3D measurement-and-solving method of the utility model is described in detail below with reference to this figure, so that it can be described more clearly through this preferred embodiment.
As shown in Fig. 3, the method comprises the following steps:
The camera 1 captures a 3D image of the area to be landed in, from which the stereo image information of the area is obtained.
The camera 1 serves as the input device, i.e. the sensor of the UAV. During solving, the camera pinhole model is adopted according to the theory of geometric perspective projection.
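As an illustration of the pinhole model used here, a depth measurement at pixel (u, v) can be back-projected into the camera frame; fx, fy, cx, cy are the usual calibrated focal lengths and principal point, assumed known from calibration rather than taken from the patent:

```python
import numpy as np

def back_project(u: float, v: float, depth: float,
                 fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Pinhole model: pixel (u, v) at the given depth -> 3D point in the camera frame."""
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return np.array([x, y, depth])
```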
The stereo image information is parsed and processed by the MCU 32.
The MCU 32 is the controller and solver of the entire control device or measurement-and-solving device. It may be implemented in a high-level language on an embedded chip, such as an ARM or DSP chip, or in a hardware description language, for example on an FPGA chip.
The parsing that the MCU 32 performs on the stereo image information includes, but is not limited to, the following processing:
Relative position solving 33 computes the relative position from the real-world position coordinates of the candidate landing points and the translation and rotation from the real-world coordinate system to the UAV body coordinate system; it may be carried out together with relative attitude solving 34.
Relative attitude solving 34 computes the relative attitude from the rotation from the real-world coordinate system to the UAV body coordinate system; it may be carried out together with relative position solving 33.
The position selection algorithm 35 selects a landing point according to the results computed by relative position solving 33 and relative attitude solving 34, choosing a suitable place to land. It then sends movement instructions to the motors 36, so that the UAV moves to the suitable place and lands. The module executing this algorithm may also be embedded in the MCU 32.
The motors 36 are the moving mechanism of the UAV and the actuators of the MCU 32. The automatic landing and hovering of the UAV can be controlled through the motors 36.
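Putting the blocks of Fig. 3 together, the closed loop might look like the following sketch; every name here (capture_3d, solve, select, move_towards, descend) is a placeholder rather than an interface defined by the patent:

```python
def autonomous_landing_step(camera, solver, selector, motors):
    """One iteration of the sense-solve-select-act loop around the MCU 32."""
    image = camera.capture_3d()                 # camera 1: stereo image of the landing area
    rel_pos, rel_att = solver.solve(image)      # blocks 33 and 34: position/attitude solving
    target = selector.select(rel_pos, rel_att)  # block 35: choose the landing point
    if target.reached:
        motors.descend()                        # motors 36: execute the automatic landing
    else:
        motors.move_towards(target)             # motors 36: move over the chosen point
```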
The optical 3D measurement-and-solving method for autonomous UAV landing provided in this embodiment comprises a relative position solving method and a relative attitude solving method. The UAV senses the environment around the landing point, calculates the relative position and relative attitude information, and selects a suitable position to land. When an obstacle unsuitable for landing is found, another suitable place is selected for landing. The automatic landing of the utility model embodiment also includes automatic landing on a ship deck.
In this embodiment, the UAV body may serve as the physical carrier of the solving MCU, and the solving method may be a software algorithm implemented in a high-level language or in a hardware description language.
In this embodiment, the UAV body may be a UAV body carrying a pinhole camera and an intelligent chip.
The intelligent chip may be designed on an embedded chip with an embedded intelligent algorithm, or implemented in a hardware description language.
The intelligent chip may be used to execute the solving methods of the utility model embodiment, which comprise relative position solving, relative attitude solving and the position selection algorithm.
Let the UAV landing target coordinates be given in the real-world frame, let X_o, Y_o, Z_o be the UAV's actual position coordinates, and let α, β, γ be the UAV's rotation angles about the X, Y and Z axes respectively.
The relative attitude solving principle adopted by the utility model embodiment is as follows:
A Z-X-Z rotation order is used: first rotate about the Z axis to obtain a rotation matrix, then rotate about the X axis to obtain a rotation matrix, and finally rotate about the Z' axis to obtain a rotation matrix. The rotation steps are:
1. In the XYZ space coordinate system, rotate the XOY plane about the Z axis by an angle α ∈ (-π, π), obtaining the X', Y', Z axes and, at the same time, the rotation matrix A;
2. In the X', Y', Z coordinate system, rotate the X'OZ plane about the Y' axis by an angle β ∈ (0, π), obtaining the new X'', Y', Z' axes and, at the same time, the rotation matrix B;
3. In the X'', Y', Z' coordinate system, rotate the X''OY' plane about the Z' axis by an angle γ ∈ (-π, π), obtaining the X''', Y'', Z' axes and, at the same time, the rotation matrix C.
The attitude transformation between the landing UAV and the landing point is then carried out with the above rotation matrices A, B and C, combined as
D = A·B·C
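The patent's explicit matrices are given only in its figures; as a sketch of the standard elemental rotations for the Z-X-Z order named above (an assumption about the exact form actually used), they would be:

$$A = R_z(\alpha) = \begin{pmatrix}\cos\alpha & -\sin\alpha & 0\\ \sin\alpha & \cos\alpha & 0\\ 0 & 0 & 1\end{pmatrix},\qquad B = R_x(\beta) = \begin{pmatrix}1 & 0 & 0\\ 0 & \cos\beta & -\sin\beta\\ 0 & \sin\beta & \cos\beta\end{pmatrix},$$
$$C = R_{z'}(\gamma) = \begin{pmatrix}\cos\gamma & -\sin\gamma & 0\\ \sin\gamma & \cos\gamma & 0\\ 0 & 0 & 1\end{pmatrix},\qquad D = A\,B\,C.$$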
From this, the relative position, namely the coordinates along the three coordinate axes, is derived by the relative position solving described above.
Relative attitude solving:
The calculation of the relative rotation attitude angles is shown in the above process. Because the UAV assumes an X configuration in flight but a cross configuration when landing, the relative attitude rotation angle between the landing point and the UAV body is 45 degrees. The rotated relative attitude matrix is E = D·F, where F is the rotation corresponding to this 45-degree offset.
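The matrix F is likewise not reproduced in the extracted text; assuming the stated 45-degree offset is a rotation about the vertical (yaw) axis, it would take the standard form:

$$F = R_z(45^\circ) = \begin{pmatrix}\cos 45^\circ & -\sin 45^\circ & 0\\ \sin 45^\circ & \cos 45^\circ & 0\\ 0 & 0 & 1\end{pmatrix} = \frac{\sqrt{2}}{2}\begin{pmatrix}1 & -1 & 0\\ 1 & 1 & 0\\ 0 & 0 & \sqrt{2}\end{pmatrix},\qquad E = D\,F.$$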
In flight, using the Euler angle principle, the flight dynamics rotation matrix is denoted H, and the relative attitude angles correspond to the three Euler angles of yaw, roll and pitch, denoted θ, ψ and φ.
By the flight dynamics principle, E = H. Since equal matrices have equal corresponding elements, the Euler angles can be derived element by element.
Inverse trigonometric functions are then used to obtain the Euler angles.
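The patent's element-by-element expressions are only given in its figures; as a sketch of the inverse-trigonometric step under the common yaw-pitch-roll (Z-Y-X) factorization (an assumed convention, since the patent's matrix H is not reproduced here):

```python
import numpy as np

def euler_from_rotation(E: np.ndarray):
    """Yaw, pitch, roll from a rotation matrix, assuming E = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    yaw   = np.arctan2(E[1, 0], E[0, 0])
    pitch = np.arcsin(-E[2, 0])
    roll  = np.arctan2(E[2, 1], E[2, 2])
    return yaw, pitch, roll
```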
The position selection algorithm above preferably selects the UAV's landing point as the candidate, among the UAV's n candidate landing points, with the minimum height difference.
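A minimal sketch of that selection rule; the flatness measure (height spread over a small neighbourhood) and the neighbourhood radius are assumptions, since the patent states only that the candidate with the minimum height difference is chosen:

```python
import numpy as np

def select_landing_point(points: np.ndarray, radius: float = 0.5) -> int:
    """points: (n, 3) candidate landing points; return the index of the flattest one."""
    best, best_spread = 0, np.inf
    for i, p in enumerate(points):
        near = points[np.linalg.norm(points[:, :2] - p[:2], axis=1) < radius]
        spread = near[:, 2].max() - near[:, 2].min()   # local height difference
        if spread < best_spread:
            best, best_spread = i, spread
    return best
```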
Fig. 4 is a flowchart of the control method for autonomous UAV landing according to an embodiment of the utility model. The method comprises the following steps:
Step S41: determine the UAV's landing point within the area to be landed in.
In this step, the UAV's landing point within the area is determined by the measurement-and-solving method provided by the utility model.
For example, measurement and solving are carried out with the measurement-and-solving method of the embodiment shown in Fig. 1, so as to determine the UAV's landing point within the area to be landed in.
Step S42: control the UAV to land at the determined landing point.
Once the UAV's landing point has been determined, the UAV can land at that point, for example by controlling the motors 36 to carry out the UAV's automatic landing.
It should be noted that the steps shown in the flowcharts of the drawings may be executed in a computer system, for example as a set of computer-executable instructions, and that, although a logical order is shown in the flowcharts, in some cases the steps may be performed in an order different from that shown or described here.
Fig. 5 is a schematic diagram of the measurement-and-solving device for autonomous UAV landing according to an embodiment of the utility model.
This device comprises:
A first acquiring unit 51 for obtaining stereo image information of the area to be landed in.
A second acquiring unit 52 for obtaining relative position information from the stereo image information of the area to be landed in, where the relative position information is the relative position between the UAV and the candidate landing points within the area.
A determining unit 53 for determining the UAV's landing point within the area to be landed in according to the relative position information.
Preferably, the second acquiring unit 52 comprises:
A first acquiring module for obtaining the coordinates of a plurality of candidate landing points within the area from the stereo image information of the area and a real-world coordinate system.
A second acquiring module for obtaining the UAV's body coordinates from the UAV's body coordinate system.
A third acquiring module for obtaining the relative position information from the coordinates of the candidate landing points and the body coordinates of the UAV.
Preferably, the device also comprises:
A third acquiring unit for obtaining relative attitude information from the stereo image information of the area before the landing point is determined from the relative position information, in which case the determining unit determines the UAV's landing point within the area from both the relative position information and the relative attitude information.
Preferably, the second acquiring unit 52 calculates the relative position according to a relative position algorithm, using the real-world coordinates of the candidate landing points and the translation and rotation from the real-world coordinate system to the body coordinate system; the third acquiring unit calculates the relative attitude according to a relative attitude algorithm, using the rotation from the real-world coordinate system to the body coordinate system; and the determining unit 53 selects the UAV's landing point from the plurality of candidate landing points according to the calculated relative position and relative attitude.
Preferably, the device also comprises a recognition unit for identifying an obstacle in the area from its stereo image information before the landing point is determined from the relative position information. The determining unit 53 comprises: an acquiring module for obtaining the size of the obstacle; a judging module for judging whether the size of the obstacle is greater than a preset size, where the preset size is greater than the size of the UAV's body; and a determining module which, when the obstacle is larger than the preset size, takes the obstacle as the UAV's landing point according to the relative position information, and otherwise, when the obstacle is smaller than or equal to the preset size, selects another landing point in the area away from the obstacle as the UAV's landing point according to the relative position information.
Fig. 6 is a schematic diagram of the control device for autonomous UAV landing according to an embodiment of the utility model. As shown in Fig. 6, the control device comprises:
A measurement-and-solving unit 61 for determining the UAV's landing point within the area to be landed in, where the landing point is determined according to the measurement-and-solving method provided by the utility model.
A control unit 62 for controlling the UAV to land at the determined landing point.
Obviously, those skilled in the art should understand that the above modules or steps of the utility model may be implemented with general-purpose computing devices; they may be concentrated on a single computing device or distributed over a network of multiple computing devices; optionally, they may be implemented as program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device, or they may be made into individual integrated circuit modules, or multiple modules or steps among them may be made into a single integrated circuit module. Thus the utility model is not limited to any specific combination of hardware and software.
The foregoing are only preferred embodiments of the utility model and are not intended to limit it; for those skilled in the art, the utility model may have various modifications and variations. Any modification, equivalent replacement or improvement made within the spirit and principle of the utility model shall be included within its scope of protection.

Claims (4)

1. A measurement-and-solving device for the autonomous landing of an unmanned aerial vehicle (UAV), characterized by comprising:
a first acquiring unit for obtaining stereo image information of an area to be landed in;
a second acquiring unit for obtaining relative position information from the stereo image information of the area to be landed in, wherein the relative position information is the relative position between the UAV and candidate landing points within the area to be landed in; and
a determining unit for determining the UAV's landing point within the area to be landed in according to the relative position information.
2. The device according to claim 1, characterized in that the second acquiring unit comprises:
a first acquiring module for obtaining the coordinates of a plurality of candidate landing points within the area to be landed in from the stereo image information of the area and a real-world coordinate system;
a second acquiring module for obtaining the body coordinates of the UAV from the UAV's body coordinate system; and
a third acquiring module for obtaining the relative position information from the coordinates of the candidate landing points and the body coordinates of the UAV.
3. The device according to claim 1, characterized in that
the device further comprises a recognition unit for identifying an obstacle in the area to be landed in from the stereo image information of the area before the landing point of the UAV is determined from the relative position information, and
the determining unit comprises:
an acquiring module for obtaining the size of the obstacle in the area to be landed in;
a judging module for judging whether the size of the obstacle is greater than a preset size, wherein the preset size is greater than the size of the UAV's body; and
a determining module for, when the size of the obstacle is judged to be greater than the preset size, taking the obstacle as the UAV's landing point according to the relative position information, and, when the size of the obstacle is judged to be less than or equal to the preset size, selecting another landing point in the area away from the obstacle as the UAV's landing point according to the relative position information.
4. A control device for the autonomous landing of a UAV, characterized by comprising:
a measurement-and-solving unit for determining the UAV's landing point within an area to be landed in, comprising the measurement-and-solving device according to any one of claims 1 to 3; and
a control unit for controlling the UAV to land at the determined landing point.
CN201520652253.7U 2015-08-27 2015-08-27 Measurement resolver and controlling means that unmanned aerial vehicle independently landed Withdrawn - After Issue CN205028160U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201520652253.7U CN205028160U (en) 2015-08-27 2015-08-27 Measurement resolver and controlling means that unmanned aerial vehicle independently landed

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201520652253.7U CN205028160U (en) 2015-08-27 2015-08-27 Measurement resolver and controlling means that unmanned aerial vehicle independently landed

Publications (1)

Publication Number Publication Date
CN205028160U true CN205028160U (en) 2016-02-10

Family

ID=55260679

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201520652253.7U Withdrawn - After Issue CN205028160U (en) 2015-08-27 2015-08-27 Measurement resolver and controlling means that unmanned aerial vehicle independently landed

Country Status (1)

Country Link
CN (1) CN205028160U (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105204515A (en) * 2015-08-27 2015-12-30 泉州装备制造研究所 Measurement parsing method and apparatus of autonomous landing of unmanned aerial vehicle, and control method and apparatus of autonomous landing of unmanned aerial vehicle
CN105204515B (en) * 2015-08-27 2018-04-10 泉州装备制造研究所 The measurement parsing of unmanned plane independent landing and control method and device
CN106557089A (en) * 2016-11-21 2017-04-05 北京中飞艾维航空科技有限公司 A kind of control method and device of unmanned plane independent landing
CN106557089B (en) * 2016-11-21 2019-11-01 北京中飞艾维航空科技有限公司 A kind of control method and device of unmanned plane independent landing

Similar Documents

Publication Publication Date Title
CN105204515B (en) The measurement parsing of unmanned plane independent landing and control method and device
Hu et al. Object traversing by monocular UAV in outdoor environment
CN110673115B (en) Combined calibration method, device, equipment and medium for radar and integrated navigation system
CN105021184B (en) It is a kind of to be used for pose estimating system and method that vision under mobile platform warship navigation
CN110609290B (en) Laser radar matching positioning method and device
EP3407294A1 (en) Information processing method, device, and terminal
CN106774410A (en) Unmanned plane automatic detecting method and apparatus
Yu et al. A UAV-based crack inspection system for concrete bridge monitoring
CN114415736B (en) Multi-stage visual accurate landing method and device for unmanned aerial vehicle
Brockers et al. Autonomous landing and ingress of micro-air-vehicles in urban environments based on monocular vision
Masselli et al. A cross-platform comparison of visual marker based approaches for autonomous flight of quadrocopters
Masselli et al. A novel marker based tracking method for position and attitude control of MAVs
Fan et al. Vision algorithms for fixed-wing unmanned aerial vehicle landing system
CN103697883B (en) A kind of aircraft horizontal attitude defining method based on skyline imaging
CN205028160U (en) Measurement resolver and controlling means that unmanned aerial vehicle independently landed
Premachandra et al. A study on hovering control of small aerial robot by sensing existing floor features
CN114488848A (en) Unmanned aerial vehicle autonomous flight system and simulation experiment platform for indoor building space
CN108225273A (en) A kind of real-time runway detection method based on sensor priori
Leichtfried et al. Autonomous flight using a smartphone as on-board processing unit in GPS-denied environments
Zhuang et al. Method of pose estimation for UAV landing
Olivares-Mendez et al. Autonomous landing of an unmanned aerial vehicle using image-based fuzzy control
Liu et al. Noncooperative target detection of spacecraft objects based on artificial bee colony algorithm
Wang et al. Visual pose measurement based on structured light for MAVs in non-cooperative environments
EP3501984A1 (en) A method for validating sensor units in a uav, and a uav
CN109542120A (en) The method and device that target object is tracked by unmanned plane

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
AV01 Patent right actively abandoned

Granted publication date: 20160210

Effective date of abandoning: 20180410
