CN102538782A - Helicopter landing guide device and method based on computer vision - Google Patents


Info

Publication number
CN102538782A
Authority
CN
China
Prior art keywords
module
helicopter
particle
coordinate
mentioned
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012100005932A
Other languages
Chinese (zh)
Other versions
CN102538782B (en)
Inventor
郑翰
李平
郑晓平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201210000593.2A priority Critical patent/CN102538782B/en
Publication of CN102538782A publication Critical patent/CN102538782A/en
Application granted granted Critical
Publication of CN102538782B publication Critical patent/CN102538782B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a helicopter landing guidance device and method based on computer vision. The device comprises a ground-based three-camera vision sensor system, a foreground search module, a tracking-and-matching module, a pose-resolving module, and a wireless data transmission module, connected in sequence; the three-camera vision sensor system is additionally connected to the tracking-and-matching module. Compared with traditional visual landing guidance schemes, the background of the captured images is the monotonous sky, which effectively reduces background interference and noise; at the same time, the image processing task is performed by a high-performance ground computer, which effectively improves real-time performance and computational precision compared with an airborne computer.

Description

Helicopter landing guidance device and method based on computer vision
Technical field
The present invention relates to the technical field of aircraft navigation, and in particular to a visual guidance method for unmanned helicopter landing and a device implementing the method.
Background technology
Autonomous landing is a very complex flight phase for a miniature unmanned helicopter: at low speed and low altitude, it must obtain accurate height, velocity, heading, and other displacement information and feed it back to the onboard flight control system, which controls the position and attitude of the aircraft so that it descends to the designated location.
The traditional landing guidance method for unmanned aerial vehicles fuses data from an airborne GPS (Global Positioning System) receiver, an inertial navigation system (INS), and an electronic compass to obtain the pose information of the helicopter. However, limited by the precision of the airborne GPS receiver, and because of noise and occlusion during landing, the GPS system cannot provide sufficiently accurate pose information. The INS and electronic compass, in turn, often accumulate large integration errors and cannot provide accurate localization information on their own.
To address this problem, landing guidance schemes have appeared in recent years that use a vision sensor to photograph ground markers and thereby obtain the pose information of the helicopter. However, visual methods require complex computation on the captured images to extract the pose information, and, constrained by the size and payload capacity of the helicopter, the performance of the onboard computer system is limited, which creates serious problems in computational real-time performance and precision. In addition, an airborne vision sensor suffers from varying degrees of image blur when capturing ground images, as well as occlusion by smoke and similar problems, so it is still necessary to seek a new and more complete helicopter landing scheme.
Summary of the invention
To improve the real-time performance and precision of aircraft pose measurement, the invention provides a device and method that use the plastic buffer balls at the ends of the helicopter landing gear as feature points, track these points in real time with a vision sensor, and obtain the helicopter pose information through image processing algorithms, so as to guide the helicopter to an autonomous precision landing.
The objective of the invention is achieved through the following technical scheme: a helicopter landing guidance device comprising a ground-based three-camera vision sensor system, a foreground search module, a tracking-and-matching module, a pose-resolving module, and a wireless data transmission module; said modules are connected in sequence, and said three-camera vision sensor system is additionally connected to the tracking-and-matching module.
A visual guidance method for unmanned helicopter landing comprises the following steps:
(1) the ground-based three-camera vision sensor system acquires images;
(2) the image foreground is searched in the images acquired in step 1, namely the four airborne artificial landmarks colored red, yellow, blue, and green;
(3) the above artificial landmarks are tracked in real time with a particle filter algorithm;
(4) the pose-resolving module calculates the position and attitude of the helicopter relative to the coordinate system;
(5) the ground station computer sends the pose information obtained in step 4 through the wireless data transmission module to a receiver connected to the flight control computer, which then transfers it to the flight control computer through an RS-232 interface;
(6) the flight control computer guides the helicopter to land at the designated location according to the above pose information.
The beneficial effects of the invention are as follows. Because artificial landmarks are mounted on the helicopter body and photographed by a ground-based multi-camera vision sensor system, rather than having a helicopter-mounted vision sensor photograph artificial landmarks on the ground, the background of the images is the monotonous sky, which effectively reduces background interference and noise. At the same time, the image processing task is performed by a high-performance ground computer, which effectively improves real-time performance and computational precision compared with an airborne computer. In addition, the multi-camera vision system fuses image information acquired from three angles, which effectively resolves occlusion between landmarks and enlarges the measurement space of the system. The invention adds only four artificial landmarks and three cameras to the original hardware, making full use of the existing ground station computer and wireless data transmission module, and thus effectively controls cost.
Description of drawings
The present invention is further described below with reference to the accompanying drawings and embodiments:
Fig. 1 is a functional block diagram of the helicopter landing device of the present invention;
Fig. 2 is a structural diagram of the ground-based three-camera vision sensor system;
Fig. 3 is a software flowchart of the foreground search module;
Fig. 4 is a software flowchart of the tracking-and-matching module;
Fig. 5 is a software flowchart of the pose-resolving module;
Fig. 6 is a flow diagram of the helicopter landing method of the present invention.
Embodiment
Fig. 1 is an example of the functional block diagram of the helicopter landing guidance device of the present invention.
As shown in Fig. 1, the helicopter landing guidance device of the present invention comprises: a ground-based three-camera vision sensor system 1, a foreground search module 2, a tracking-and-matching module 3, a pose-resolving module 4, and a wireless data transmission module 5, connected in sequence; the three-camera vision sensor system 1 is additionally connected to the tracking-and-matching module 3.
As shown in Fig. 2, the ground-based three-camera vision sensor system comprises three optical cameras, connected as shown in Fig. 2. The optical cameras can be the C160 product of Logitech, but are not limited thereto.
As shown in Fig. 3, the foreground search module works as follows: convert the image output by the vision sensor from the RGB space to the HSV space; detect the required color regions by thresholding, namely the red, yellow, blue, and green regions corresponding to the four artificial landmarks; invert each of these color regions to its complementary color (green, blue, yellow, and red respectively); convert the two images (the original color regions and their inverses) to grayscale to remove noise, and perform a difference operation to eliminate the background; then apply the Canny operator to the resulting image to find the edges of the balls; and finally use the Hough transform to solve for the radii and image coordinates of the centers of the four circles. This module can be implemented in C++ with the OpenCV library, but is not limited thereto.
The above artificial landmarks are the four rigid plastic buffer balls fixed to the ends of the helicopter landing gear, whose positions relative to the helicopter's center of mass have been calibrated. The artificial landmarks can be the landing gear supplied with the T-REX 600 Nitro Super Pro helicopter produced by the Japanese company Futaba, but are not limited thereto.
As shown in Fig. 4, the tracking-and-matching module works as follows: set the number of particles and adopt a first-order adaptive model as the motion model; take the four landmarks detected in step 2 as tracking targets; and build a target model for each artificial landmark using a weighted color histogram. Let y be the center coordinate of the target's projected circle, m the number of histogram quantization levels, and {x_i}, i = 1, 2, ..., N, the pixel coordinates inside the tracking region. The weighted color histogram target model p = {p_u(y)}, u = 1, 2, ..., m, is

    p_u(y) = C · Σ_{i=1}^{N} k(‖(y − x_i)/h‖²) · δ[b(x_i) − u]

    C = 1 / Σ_{i=1}^{N} k(‖(y − x_i)/h‖²)

where h is the kernel bandwidth, k is the profile function corresponding to the kernel, the function b assigns a pixel to its histogram bin according to its color value, δ is the Kronecker delta function, and u is the histogram quantization index.
An initial particle set of N particles is then built:

    S_0 = { s_0^(i) }, i = 1, 2, ..., N

where s_0^(i) denotes the circle-center coordinate and radius of the landmark's projected circle carried by the i-th particle at frame 0, and the initial weight of every particle is 1/N. In subsequent image frames, a random state transition propagates the particles, their weights are evaluated, the weighted estimate of each target's coordinate in the image plane is computed and output, and the particles are then resampled and redistributed. This module can be implemented in C++, but is not limited thereto.
As shown in Fig. 5, the pose-resolving module works as follows. The transformation between the helicopter coordinate system and the camera coordinate system is

    [x_c, y_c, z_c, 1]^T = [ R  t ; 0  1 ] · [x_h, y_h, z_h, 1]^T

where (x_c, y_c, z_c) and (x_h, y_h, z_h) are the homogeneous coordinates of a landmark center in the camera coordinate system and in the helicopter coordinate system, respectively. The rotation matrix R and translation vector t of the helicopter coordinate system with respect to the camera coordinate origin are solved by an iterative least-squares algorithm. The yaw, roll, and pitch angles of the helicopter are obtained by resolving the rotation matrix, and its height and distance from the landing point are obtained by resolving the translation vector.
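To make the attitude extraction concrete, the sketch below (Python/NumPy) shows how yaw, pitch, and roll could be recovered from the rotation matrix R, and how the homogeneous transform maps helicopter-frame coordinates into the camera frame. It assumes a Z-Y-X (yaw-pitch-roll) Euler convention, which the patent does not specify, and it does not reproduce the iterative least-squares solver itself:

```python
import numpy as np

def euler_zyx_from_R(R):
    """Recover (yaw, pitch, roll) assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    pitch = np.arcsin(-R[2, 0])          # -sin(pitch) sits at R[2,0] in this convention
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return yaw, pitch, roll

def helicopter_to_camera(R, t, p_h):
    """Apply [R t; 0 1] to a helicopter-frame point (x_h, y_h, z_h)."""
    return R @ np.asarray(p_h, dtype=float) + t    # x_c = R x_h + t
```

Height and horizontal distance to the landing point then follow directly from the components of t together with the calibrated landmark offsets.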
The wireless data transmission module can be the APC200A-43 module produced by Anmeitong Science and Technology Co., Ltd., but is not limited thereto.
As shown in Fig. 6, the visual guidance method for unmanned helicopter landing of the present invention comprises the following steps:
In the initial stage, assume that the helicopter has flown, under GPS guidance, into the effective field of view of the ground vision sensor system. The flight control computer then switches to the helicopter landing guidance device of the present invention to obtain the high-precision pose information required for landing.
1. The ground-based three-camera vision sensor system acquires images.
The three-camera vision sensor system acquires images in real time and sends them to the foreground search module.
2. The image foreground is searched in the images acquired in step 1, namely the four airborne artificial landmarks colored red, yellow, blue, and green.
The specific operations are as follows: the foreground search module converts the images output by the three-camera vision sensor system from the RGB space to the HSV space; detects the required color regions by thresholding, namely the red, yellow, blue, and green regions corresponding to the four artificial landmarks; inverts each of these color regions to its complementary color (green, blue, yellow, and red respectively); converts the two images (the original color regions and their inverses) to grayscale to remove noise, and performs a difference operation to eliminate the background; then applies the Canny operator to the resulting image to find the edges of the balls; and finally uses the Hough transform to solve for the radii and image coordinates of the centers of the four circles.
3. The above artificial landmarks are tracked in real time with a particle filter algorithm.
The specific operations are as follows: in the tracking-and-matching module, set the number of particles and adopt a first-order adaptive model as the motion model; take the four landmarks detected in step 2 as tracking targets; and build a target model for each artificial landmark using a weighted color histogram. Let y be the center coordinate of the target's projected circle, m the number of histogram quantization levels, and {x_i}, i = 1, 2, ..., N, the pixel coordinates inside the tracking region. The weighted color histogram target model p = {p_u(y)}, u = 1, 2, ..., m, is

    p_u(y) = C · Σ_{i=1}^{N} k(‖(y − x_i)/h‖²) · δ[b(x_i) − u]

    C = 1 / Σ_{i=1}^{N} k(‖(y − x_i)/h‖²)

where h is the kernel bandwidth, k is the profile function corresponding to the kernel, the function b assigns a pixel to its histogram bin according to its color value, δ is the Kronecker delta function, and u is the histogram quantization index.
An initial particle set of N particles is then built:

    S_0 = { s_0^(i) }, i = 1, 2, ..., N

where s_0^(i) denotes the circle-center coordinate and radius of the landmark's projected circle carried by the i-th particle at frame 0, and the initial weight of every particle is 1/N. In subsequent image frames, a random state transition propagates the particles, their weights are evaluated, the weighted estimate of each target's coordinate in the image plane is computed and output, and the particles are then resampled and redistributed.
4. The pose-resolving module calculates the position and attitude of the helicopter relative to the coordinate system.
The specific operations are as follows: the transformation between the helicopter coordinate system and the camera coordinate system is

    [x_c, y_c, z_c, 1]^T = [ R  t ; 0  1 ] · [x_h, y_h, z_h, 1]^T

where (x_c, y_c, z_c) and (x_h, y_h, z_h) are the homogeneous coordinates of a landmark center in the camera coordinate system and in the helicopter coordinate system, respectively. The rotation matrix R and translation vector t of the helicopter coordinate system with respect to the camera coordinate origin are solved by an iterative least-squares algorithm. The yaw, roll, and pitch angles of the helicopter are obtained by resolving the rotation matrix, and its height and distance from the landing point are obtained by resolving the translation vector.
5. The ground station computer sends the above pose information through the wireless data transmission module to a receiver connected to the flight control computer, which then transfers it to the flight control computer through an RS-232 interface.
6. The flight control computer guides the helicopter to land at the designated location according to the above pose information.
Following the above steps, the unmanned helicopter can achieve a safe and precise landing.

Claims (5)

1. A helicopter landing guidance device, characterized in that it comprises: a ground-based three-camera vision sensor system, a foreground search module, a tracking-and-matching module, a pose-resolving module, and a wireless data transmission module; said three-camera vision sensor system, foreground search module, tracking-and-matching module, pose-resolving module, and wireless data transmission module are connected in sequence, and said three-camera vision sensor system is additionally connected to the tracking-and-matching module.
2. A visual guidance method for unmanned helicopter landing applying the helicopter landing guidance device of claim 1, characterized in that the method comprises the following steps:
(1) the ground-based three-camera vision sensor system acquires images;
(2) the image foreground is searched in the images acquired in step 1, namely the four airborne artificial landmarks colored red, yellow, blue, and green;
(3) the above artificial landmarks are tracked in real time with a particle filter algorithm;
(4) the pose-resolving module calculates the position and attitude of the helicopter relative to the coordinate system;
(5) the ground station computer sends the pose information obtained in step 4 through the wireless data transmission module to a receiver connected to the flight control computer, which then transfers it to the flight control computer through an RS-232 interface;
(6) the flight control computer guides the helicopter to land at the designated location according to the above pose information.
3. The visual guidance method for unmanned helicopter landing according to claim 2, characterized in that said step 2 specifically comprises: converting the images output by the ground-based three-camera vision sensor system from the RGB space to the HSV space; detecting the required color regions by thresholding, namely the red, yellow, blue, and green regions corresponding to the four artificial landmarks; inverting each of these color regions to its complementary color (green, blue, yellow, and red respectively); converting the two images (the original color regions and their inverses) to grayscale to remove noise, and performing a difference operation to eliminate the background; then applying the Canny operator to the resulting image to find the edges of the balls; and finally using the Hough transform to solve for the radii and image coordinates of the centers of the four circles.
4. The visual guidance method for unmanned helicopter landing according to claim 2, characterized in that said step 3 specifically comprises: setting the number of particles and adopting a first-order adaptive model as the motion model; taking the four landmarks detected in step 2 as tracking targets; and building a target model for each artificial landmark using a weighted color histogram. Let y be the center coordinate of the target's projected circle, m the number of histogram quantization levels, and {x_i}, i = 1, 2, ..., N, the pixel coordinates inside the tracking region. The weighted color histogram target model p = {p_u(y)}, u = 1, 2, ..., m, is

    p_u(y) = C · Σ_{i=1}^{N} k(‖(y − x_i)/h‖²) · δ[b(x_i) − u]

    C = 1 / Σ_{i=1}^{N} k(‖(y − x_i)/h‖²)

where h is the kernel bandwidth, k is the profile function corresponding to the kernel, the function b assigns a pixel to its histogram bin according to its color value, δ is the Kronecker delta function, and u is the histogram quantization index. An initial particle set of N particles is then built:

    S_0 = { s_0^(i) }, i = 1, 2, ..., N

where s_0^(i) denotes the circle-center coordinate and radius of the landmark's projected circle carried by the i-th particle at frame 0, and the initial weight of every particle is 1/N; in subsequent image frames, a random state transition propagates the particles, their weights are evaluated, the weighted estimate of each target's coordinate in the image plane is computed and output, and the particles are then resampled and redistributed.
5. The visual guidance method for unmanned helicopter landing according to claim 2, characterized in that said step 4 specifically comprises: the transformation between the helicopter coordinate system and the camera coordinate system is

    [x_c, y_c, z_c, 1]^T = [ R  t ; 0  1 ] · [x_h, y_h, z_h, 1]^T

where (x_c, y_c, z_c) and (x_h, y_h, z_h) are the homogeneous coordinates of a landmark center in the camera coordinate system and in the helicopter coordinate system, respectively; the rotation matrix R and translation vector t of the helicopter coordinate system with respect to the camera coordinate origin are solved by an iterative least-squares algorithm; the yaw, roll, and pitch angles of the helicopter are obtained by resolving the rotation matrix, and its height and distance from the landing point are obtained by resolving the translation vector.
CN201210000593.2A 2012-01-04 2012-01-04 Helicopter landing guide device and method based on computer vision Expired - Fee Related CN102538782B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210000593.2A CN102538782B (en) 2012-01-04 2012-01-04 Helicopter landing guide device and method based on computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210000593.2A CN102538782B (en) 2012-01-04 2012-01-04 Helicopter landing guide device and method based on computer vision

Publications (2)

Publication Number Publication Date
CN102538782A true CN102538782A (en) 2012-07-04
CN102538782B CN102538782B (en) 2014-08-27

Family

ID=46346273

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210000593.2A Expired - Fee Related CN102538782B (en) 2012-01-04 2012-01-04 Helicopter landing guide device and method based on computer vision

Country Status (1)

Country Link
CN (1) CN102538782B (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226356A (en) * 2013-02-27 2013-07-31 广东工业大学 Image-processing-based unmanned plane accurate position landing method
CN103413466A (en) * 2013-07-08 2013-11-27 中国航空无线电电子研究所 Airborne visible ground guide and warning device and guide and warning method thereof
CN104154910A (en) * 2014-07-22 2014-11-19 清华大学 Indoor micro unmanned aerial vehicle location method
CN104679013A (en) * 2015-03-10 2015-06-03 无锡桑尼安科技有限公司 Unmanned plane automatic landing system
CN105068548A (en) * 2015-08-12 2015-11-18 北京贯中精仪科技有限公司 Landing guide system of unmanned aerial vehicle
CN105182994A (en) * 2015-08-10 2015-12-23 普宙飞行器科技(深圳)有限公司 Unmanned-aerial-vehicle fixed-point landing method
CN105676875A (en) * 2015-03-10 2016-06-15 张超 Automatic landing system of unmanned aerial vehicle
CN105979119A (en) * 2016-06-02 2016-09-28 深圳迪乐普数码科技有限公司 Method for filtering infrared rocking arm tracking motion data and terminal
CN107544550A (en) * 2016-06-24 2018-01-05 西安电子科技大学 A kind of Autonomous Landing of UAV method of view-based access control model guiding
CN109521791A (en) * 2018-09-28 2019-03-26 易瓦特科技股份公司 Identification method and device based on earth station
CN109791413A (en) * 2016-10-10 2019-05-21 高通股份有限公司 For making system and method for the UAV Landing on mobile foundation
CN113154220A (en) * 2021-03-26 2021-07-23 苏州略润娇贸易有限公司 Easily-built computer for construction site and use method thereof
CN114812513A (en) * 2022-05-10 2022-07-29 北京理工大学 Unmanned aerial vehicle positioning system and method based on infrared beacon

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101000243A (en) * 2007-01-16 2007-07-18 北京航空航天大学 Pilotless plane landing navigation method and its device
CN101109640A (en) * 2006-07-19 2008-01-23 北京航空航天大学 Unmanned aircraft landing navigation system based on vision
CN101833104A (en) * 2010-04-27 2010-09-15 北京航空航天大学 Three-dimensional visual navigation method based on multi-sensor information fusion

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101109640A (en) * 2006-07-19 2008-01-23 北京航空航天大学 Unmanned aircraft landing navigation system based on vision
CN101000243A (en) * 2007-01-16 2007-07-18 北京航空航天大学 Pilotless plane landing navigation method and its device
CN101833104A (en) * 2010-04-27 2010-09-15 北京航空航天大学 Three-dimensional visual navigation method based on multi-sensor information fusion

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226356A (en) * 2013-02-27 2013-07-31 广东工业大学 Image-processing-based unmanned plane accurate position landing method
CN103413466A (en) * 2013-07-08 2013-11-27 中国航空无线电电子研究所 Airborne visible ground guide and warning device and guide and warning method thereof
CN104154910A (en) * 2014-07-22 2014-11-19 清华大学 Indoor micro unmanned aerial vehicle location method
CN104679013A (en) * 2015-03-10 2015-06-03 无锡桑尼安科技有限公司 Unmanned plane automatic landing system
CN105068553A (en) * 2015-03-10 2015-11-18 无锡桑尼安科技有限公司 Unmanned aerial vehicle automatic landing system
CN105676875A (en) * 2015-03-10 2016-06-15 张超 Automatic landing system of unmanned aerial vehicle
CN105182994B (en) * 2015-08-10 2018-02-06 普宙飞行器科技(深圳)有限公司 A kind of method of unmanned plane pinpoint landing
CN105182994A (en) * 2015-08-10 2015-12-23 普宙飞行器科技(深圳)有限公司 Unmanned-aerial-vehicle fixed-point landing method
CN105068548A (en) * 2015-08-12 2015-11-18 北京贯中精仪科技有限公司 Landing guide system of unmanned aerial vehicle
CN105068548B (en) * 2015-08-12 2019-06-28 北京贯中精仪科技有限公司 UAV Landing guides system
CN105979119A (en) * 2016-06-02 2016-09-28 深圳迪乐普数码科技有限公司 Method for filtering infrared rocking arm tracking motion data and terminal
CN105979119B (en) * 2016-06-02 2019-07-16 深圳迪乐普数码科技有限公司 A kind of filtering method and terminal of infrared rocker arm pursuit movement data
CN107544550A (en) * 2016-06-24 2018-01-05 西安电子科技大学 A kind of Autonomous Landing of UAV method of view-based access control model guiding
CN107544550B (en) * 2016-06-24 2021-01-15 西安电子科技大学 Unmanned aerial vehicle automatic landing method based on visual guidance
CN109791413A (en) * 2016-10-10 2019-05-21 高通股份有限公司 For making system and method for the UAV Landing on mobile foundation
CN109521791A (en) * 2018-09-28 2019-03-26 易瓦特科技股份公司 Identification method and device based on earth station
CN113154220A (en) * 2021-03-26 2021-07-23 苏州略润娇贸易有限公司 Easily-built computer for construction site and use method thereof
CN114812513A (en) * 2022-05-10 2022-07-29 北京理工大学 Unmanned aerial vehicle positioning system and method based on infrared beacon

Also Published As

Publication number Publication date
CN102538782B (en) 2014-08-27

Similar Documents

Publication Publication Date Title
CN102538782B (en) Helicopter landing guide device and method based on computer vision
Martínez et al. On-board and ground visual pose estimation techniques for UAV control
Patruno et al. A vision-based approach for unmanned aerial vehicle landing
US11906983B2 (en) System and method for tracking targets
CN110222612B (en) Dynamic target identification and tracking method for autonomous landing of unmanned aerial vehicle
CN109753076A (en) A kind of unmanned plane vision tracing implementing method
CN110221625B (en) Autonomous landing guiding method for precise position of unmanned aerial vehicle
US10133929B2 (en) Positioning method and positioning device for unmanned aerial vehicle
CN103822635A (en) Visual information based real-time calculation method of spatial position of flying unmanned aircraft
CN104808685A (en) Vision auxiliary device and method for automatic landing of unmanned aerial vehicle
CN108520559B (en) Unmanned aerial vehicle positioning and navigation method based on binocular vision
CN103365297A (en) Optical flow-based four-rotor unmanned aerial vehicle flight control method
Li et al. UAV autonomous landing technology based on AprilTags vision positioning algorithm
CN105644785A (en) Unmanned aerial vehicle landing method based on optical flow method and horizon line detection
US20200357141A1 (en) Systems and methods for calibrating an optical system of a movable object
Eynard et al. Real time UAV altitude, attitude and motion estimation from hybrid stereovision
Smyczyński et al. Autonomous drone control system for object tracking: Flexible system design with implementation example
CN108122255A (en) It is a kind of based on trapezoidal with circular combination terrestrial reference UAV position and orientation method of estimation
CN102788579A (en) Unmanned aerial vehicle visual navigation method based on SIFT algorithm
CN108225273A (en) A kind of real-time runway detection method based on sensor priori
Chen et al. Real-time geo-localization using satellite imagery and topography for unmanned aerial vehicles
CN110322462B (en) Unmanned aerial vehicle visual landing method and system based on 5G network
CN116578035A (en) Rotor unmanned aerial vehicle autonomous landing control system based on digital twin technology
CN114815871A (en) Vision-based autonomous landing method for vertical take-off and landing unmanned mobile platform
Martínez et al. Trinocular ground system to control UAVs

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140827

Termination date: 20150104

EXPY Termination of patent right or utility model