CN102538782A - Helicopter landing guide device and method based on computer vision - Google Patents

Helicopter landing guide device and method based on computer vision

Info

Publication number
CN102538782A
Authority
CN
China
Prior art keywords
helicopter
module
ground
landing
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012100005932A
Other languages
Chinese (zh)
Other versions
CN102538782B (en)
Inventor
郑翰
李平
郑晓平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201210000593.2A
Publication of CN102538782A
Application granted
Publication of CN102538782B
Expired - Fee Related
Anticipated expiration

Landscapes

  • Image Analysis (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a helicopter landing guide device and method based on computer vision. The device comprises a ground three-camera vision sensor system, a foreground searching module, a tracking and matching module, a position and posture (pose) resolving module and a wireless data transmission module, connected in sequence; the vision sensor system is additionally connected to the tracking and matching module. Compared with traditional vision-guided landing schemes, the background of the image is the monotonous sky, which effectively reduces background interference and noise; meanwhile, the image-processing task is completed by a high-performance ground computer, which improves real-time performance and calculation precision compared with an airborne computer.

Description

Helicopter landing guiding device and method based on computer vision
Technical Field
The invention relates to the technical field of aircraft navigation, in particular to a landing visual guidance method for an unmanned helicopter and a device for realizing the method.
Background
Autonomous landing is a very complicated flight phase for a micro unmanned helicopter: at low speed and low altitude, accurate self-motion information such as altitude, velocity and heading must be obtained and fed back to the airborne flight control system so that the aircraft's position and attitude can be controlled to land at a specified location.
The traditional unmanned aerial vehicle landing guidance method fuses data from an airborne Global Positioning System (GPS), an Inertial Navigation System (INS) and an electronic compass to obtain the helicopter's pose. However, limited by the accuracy of the airborne GPS receiver and subject to noise interference and occlusion during landing, GPS cannot provide sufficiently accurate pose information, while the INS and the electronic compass accumulate large integration errors and cannot independently provide accurate positioning.
In response to this problem, landing guidance schemes have appeared in recent years in which a vision sensor photographs a ground marker to acquire the helicopter's pose. However, such vision methods must perform complex computation on the acquired images to extract the pose, and because the helicopter's size and payload limit the performance of the onboard computer system, both real-time operation and computational precision are seriously constrained. In addition, an airborne vision sensor suffers to varying degrees from image jitter, smoke occlusion and similar problems when acquiring ground images, so a new and more complete helicopter landing scheme is still needed.
Disclosure of Invention
To improve the real-time performance and precision of aircraft pose measurement, the invention provides a device and method for guiding a helicopter to an autonomous fixed-point landing: the plastic buffer balls at the ends of the helicopter's landing skids serve as feature corner points, vision sensors track these corner points in real time, and an image-processing algorithm extracts the helicopter's pose information.
The purpose of the invention is realized by the following technical scheme: a helicopter landing guide apparatus comprising a ground three-camera vision sensor system, a foreground searching module, a tracking matching module, a pose resolving module and a wireless data transmission module, connected in sequence, with the ground three-camera vision sensor system also connected directly to the tracking matching module.
A landing visual guidance method for an unmanned helicopter comprises the following steps:
(1) the ground three-camera vision sensor system collects images;
(2) searching the image foreground, namely the four airborne artificial landmarks colored red, yellow, blue and green, in the images acquired in step 1;
(3) tracking the artificial landmark in real time by using a particle filtering algorithm;
(4) calculating the position and the attitude of the helicopter relative to a ground coordinate system by a pose resolving module;
(5) the ground station computer transmits the pose information obtained in the step 4 to a receiver connected to the flight control computer through a wireless data transmission module, and then transmits the pose information to the flight control computer through an RS-232 interface;
(6) the flight control computer guides the helicopter to land at the specified position according to the pose information.
The invention has the following advantages. Because artificial landmarks are mounted on the helicopter body and photographed from below by a ground multi-camera vision sensor system, the image background is the monotonous sky; compared with schemes in which a helicopter-borne vision sensor photographs ground landmarks, this effectively reduces background interference and noise. Meanwhile, a high-performance ground computer completes the image-processing task, which improves real-time performance and calculation precision relative to an airborne computer. In addition, the multi-camera vision system collects image information from three angles and fuses it, which effectively solves the problem of occlusion between landmarks and enlarges the measurement volume of the system. The invention adds only four artificial landmarks and three cameras to the original hardware, makes full use of the existing ground-station computer and wireless data transmission module, and thus keeps cost under effective control.
Drawings
The invention is further illustrated with reference to the following figures and examples:
FIG. 1 is a functional block diagram of a helicopter landing gear of the present invention;
FIG. 2 is a diagram of the ground three-camera vision sensor system;
FIG. 3 is a foreground search module software flow diagram;
FIG. 4 is a trace matching module software flow diagram;
FIG. 5 is a pose resolving module software flow diagram;
FIG. 6 is a flow chart illustrating a helicopter landing method of the present invention.
Detailed Description
Fig. 1 is an example of a functional block diagram of a helicopter landing guidance apparatus to which the present invention relates.
As shown in fig. 1, the helicopter landing guiding apparatus of the present invention comprises: the ground three-camera vision sensor system 1, the foreground searching module 2, the tracking matching module 3, the pose resolving module 4 and the wireless data transmission module 5 are sequentially connected, and the ground three-camera vision sensor system 1 is connected with the tracking matching module 3.
As shown in fig. 2, the ground three-camera vision sensor system comprises three optical cameras, connected as illustrated in the figure. The optical cameras may be, but are not limited to, Logitech C160 cameras.
As shown in fig. 3, the foreground search module works as follows: convert the image output by the vision sensor from the RGB domain to the HSV domain; screen out the required color areas, namely the red, yellow, blue and green areas corresponding to the four artificial landmarks; invert these color areas to their complementary colors, namely green, blue, yellow and red; convert both images to grayscale and denoise them, then take their difference to eliminate the background; apply the Canny operator to the resulting image to find the edges of the spheres; and finally apply the Hough transform to obtain the radius and center coordinates of each of the four circles in the image. This module can be implemented, but is not limited to being implemented, by calling OpenCV library functions from C++.
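As a concrete illustration (not the patent's actual code), the following C++ sketch strings together the OpenCV calls named above. The HSV thresholds, the blur and Hough parameters, and the camera index are assumptions for illustration; the color-inversion and image-difference steps are folded into a single inRange() mask here for brevity.

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Detect the projection circles of one coloured landmark: returns
// (centre x, centre y, radius) for each circle found.
std::vector<cv::Vec3f> findLandmark(const cv::Mat& bgr,
                                    const cv::Scalar& hsvLo,
                                    const cv::Scalar& hsvHi) {
    cv::Mat hsv, mask, gray;
    cv::cvtColor(bgr, hsv, cv::COLOR_BGR2HSV);        // RGB domain -> HSV domain
    cv::inRange(hsv, hsvLo, hsvHi, mask);             // screen the required colour area
    cv::GaussianBlur(mask, gray, cv::Size(9, 9), 2);  // grey image, de-noised
    std::vector<cv::Vec3f> circles;
    // HOUGH_GRADIENT applies the Canny operator internally (param1 is its
    // upper threshold) to find the sphere's edge, then votes for circles.
    cv::HoughCircles(gray, circles, cv::HOUGH_GRADIENT,
                     /*dp=*/1, /*minDist=*/gray.rows / 8.0,
                     /*param1=*/150, /*param2=*/20,
                     /*minRadius=*/3, /*maxRadius=*/100);
    return circles;
}

int main() {
    cv::VideoCapture cap(0);                          // one of the three ground cameras
    cv::Mat frame;
    while (cap.read(frame)) {
        // Example thresholds for the red landmark (hue near 0 on OpenCV's 0..179 scale).
        auto red = findLandmark(frame, cv::Scalar(0, 120, 70),
                                cv::Scalar(10, 255, 255));
        for (const auto& c : red)
            cv::circle(frame, {cvRound(c[0]), cvRound(c[1])},
                       cvRound(c[2]), cv::Scalar(0, 255, 0), 2);
        cv::imshow("landmarks", frame);
        if (cv::waitKey(1) == 27) break;              // Esc quits
    }
    return 0;
}
```

In practice the same routine would be run once per color, and once per camera, with thresholds tuned to the actual buffer-ball paint.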
The artificial landmarks are four hard plastic buffer balls fixed at the ends of the helicopter's landing skids, with their positions relative to the helicopter's center of mass calibrated. The landmarks may be, but are not limited to, the buffer balls fitted to the landing skids of the T-REX 600 Nitro Super Pro helicopter from Futaba Corporation of Japan.
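For illustration, the calibrated offsets might simply be stored as constants in the helicopter body frame; the numeric values below are placeholders, since the patent only states that the positions relative to the center of mass are calibrated.

```cpp
#include <opencv2/core.hpp>

// Calibrated centres of the four buffer balls in the helicopter body frame
// (metres, relative to the centre of mass). Values are illustrative
// placeholders; a real system would measure them once per airframe.
const cv::Point3d LANDMARKS_H[4] = {
    { 0.30,  0.25, -0.35 },   // red ball
    { 0.30, -0.25, -0.35 },   // yellow ball
    {-0.30,  0.25, -0.35 },   // blue ball
    {-0.30, -0.25, -0.35 },   // green ball
};
```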
As shown in fig. 4, the tracking matching module works as follows: set the number of particles, and adopt a first-order adaptive model as the motion model; take the four landmarks detected by the foreground searching module as the tracking targets, and build the target model with a weighted color histogram. For each artificial landmark, let y be the center coordinate of its projection circle, m the number of histogram quantization levels, and {x_i}_{i=1,2,…,N} the pixel coordinates inside the tracking region; the weighted color-histogram target model p = {p_u(y)}_{u=1,2,…,m} is then

    p_u(y) = C_h \sum_{i=1}^{N} k\left(\left\|\frac{y - x_i}{h}\right\|^2\right) \delta[b(x_i) - u], \qquad C_h = \left[\sum_{i=1}^{N} k\left(\left\|\frac{y - x_i}{h}\right\|^2\right)\right]^{-1},

where h is the kernel bandwidth, k(·) is the profile function corresponding to the kernel, the function b assigns each pixel to a histogram bin according to its color value, δ[·] is the Kronecker delta function, and u is the histogram quantization level.

An initial particle set of N particles is then established,

    \{ s_0^{(i)} \}_{i=1,2,…,N},

where s_0^{(i)} denotes the center coordinates and radius of the landmark projection circle carried by the i-th particle at frame 0; each particle starts with weight 1/N. In subsequent image frames the system state is propagated to spread the particles randomly and the particle weights are computed; the weighted particles give the coordinate position of each target in the image plane; the coordinate positions are output; and the particles are then resampled to redistribute their positions. This module can be implemented in C++, but is not limited thereto.
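To make the target model concrete, here is a short C++ sketch of the weighted color histogram p_u(y) above, together with the Bhattacharyya similarity commonly used to weight particles in color-based particle filters. Hue-only quantization and the Epanechnikov profile k(x) = 1 − x (for x ≤ 1) are assumptions for illustration; the patent does not fix the kernel or the color-space quantization.

```cpp
#include <opencv2/opencv.hpp>
#include <cmath>
#include <vector>

std::vector<double> colourHistogram(const cv::Mat& hsv,   // 8-bit HSV image
                                    cv::Point2d y,        // circle centre y
                                    double h,             // kernel bandwidth
                                    int m) {              // quantization levels
    std::vector<double> p(m, 0.0);
    double C = 0.0;                                       // normalisation C_h
    for (int r = 0; r < hsv.rows; ++r)
        for (int c = 0; c < hsv.cols; ++c) {
            double dx = (c - y.x) / h, dy = (r - y.y) / h;
            double d2 = dx * dx + dy * dy;                // ||(y - x_i)/h||^2
            if (d2 > 1.0) continue;                       // outside kernel support
            double k = 1.0 - d2;                          // profile function k(.)
            int u = hsv.at<cv::Vec3b>(r, c)[0] * m / 180; // bin index b(x_i)
            p[u] += k;                                    // the delta[b(x_i)-u] term
            C += k;
        }
    if (C > 0.0)
        for (double& pu : p) pu /= C;                     // so that sum_u p_u(y) = 1
    return p;
}

// Bhattacharyya coefficient between target model q and candidate p; a
// particle's weight is often taken proportional to exp(-(1 - rho)/sigma^2).
double bhattacharyya(const std::vector<double>& p, const std::vector<double>& q) {
    double rho = 0.0;
    for (size_t u = 0; u < p.size(); ++u) rho += std::sqrt(p[u] * q[u]);
    return rho;
}
```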
As shown in fig. 5, the pose resolving module works as follows. The transformation between the helicopter coordinate system and the camera coordinate system is known:

    \begin{pmatrix} x_c \\ y_c \\ z_c \\ 1 \end{pmatrix} = \begin{pmatrix} R & t \\ 0^\top & 1 \end{pmatrix} \begin{pmatrix} x_h \\ y_h \\ z_h \\ 1 \end{pmatrix},

where (x_c, y_c, z_c) and (x_h, y_h, z_h) are the homogeneous coordinates of a landmark center in the camera coordinate system and in the helicopter coordinate system, respectively. An iterative least-squares algorithm solves for the rotation matrix R and the translation vector t of the helicopter coordinate system relative to the origin of the camera coordinate system; decomposing R yields the helicopter's yaw, roll and pitch angles, and t yields the helicopter's height and its distance from the landing point.
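For concreteness, one common way to recover the attitude angles from R (assuming a Z-Y-X yaw–pitch–roll rotation order, which the patent itself does not specify) reads them off the entries r_{ij} of R:

    \psi = \operatorname{atan2}(r_{21}, r_{11}), \qquad \theta = -\arcsin(r_{31}), \qquad \varphi = \operatorname{atan2}(r_{32}, r_{33}),

where ψ, θ and φ are yaw, pitch and roll; the helicopter's height and horizontal distance to the landing point then follow directly from the components of t.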
The wireless data transmission module may be, but is not limited to, the APC200A-43 module manufactured by Shenzhen Appcon Technologies Co., Ltd.
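As an illustration of how the ground station might hand the solved pose to the radio link, here is a minimal C++ sketch. The six-float frame layout, the 0xAA/0x55 sync header, the device path and the 9600-baud setting are all assumptions for illustration; the patent only names the RS-232 interface and the radio module.

```cpp
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstring>

struct Pose { float x, y, z, yaw, pitch, roll; };  // pose information frame

int openSerial(const char* dev) {
    int fd = open(dev, O_RDWR | O_NOCTTY);
    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                      // raw 8-bit bytes, no line editing
    cfsetispeed(&tio, B9600);             // assumed baud rate
    cfsetospeed(&tio, B9600);
    tio.c_cflag |= CLOCAL | CREAD;        // ignore modem lines, enable receiver
    tcsetattr(fd, TCSANOW, &tio);
    return fd;
}

void sendPose(int fd, const Pose& p) {
    unsigned char buf[2 + sizeof(Pose)];
    buf[0] = 0xAA; buf[1] = 0x55;         // simple sync header (assumption)
    std::memcpy(buf + 2, &p, sizeof(Pose));
    write(fd, buf, sizeof(buf));          // radio modem forwards to the receiver
}

int main() {
    int fd = openSerial("/dev/ttyUSB0");  // hypothetical device path
    sendPose(fd, Pose{1.0f, 2.0f, 5.0f, 0.1f, 0.0f, 0.0f});
    close(fd);
    return 0;
}
```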
As shown in fig. 6, the visual guidance method for landing of unmanned helicopter of the present invention comprises the following steps:
At the beginning, the helicopter is assumed to have flown into the effective field of view of the ground vision sensor system under the guidance of the global positioning system. The flight control computer then switches over to the helicopter landing guide device, which provides the high-precision pose information required for landing.
1. The three-camera vision sensor system on the ground collects images.
The ground three-camera vision sensor system collects images in real time and transmits the collected images to the foreground searching module.
2. Search the image foreground, namely the four airborne artificial landmarks colored red, yellow, blue and green, in the images acquired in step 1.
The specific operation is as follows: the foreground searching module converts the image output by the ground three-camera vision sensor system from the RGB domain to the HSV domain; screens out the required color areas, namely the red, yellow, blue and green areas corresponding to the four artificial landmarks; inverts these color areas to their complementary colors, namely green, blue, yellow and red; converts both images to grayscale and denoises them, then takes their difference to eliminate the background; applies the Canny operator to find the edges of the spheres; and finally applies the Hough transform to obtain the radius and center coordinates of each of the four circles in the image.
3. Track the artificial landmarks in real time using a particle filter algorithm.
The specific operation is as follows: in the tracking matching module, set the number of particles and adopt a first-order adaptive model as the motion model; take the four landmarks detected in step 2 as the tracking targets and build the target model with a weighted color histogram. For each artificial landmark, let y be the center coordinate of its projection circle, m the number of histogram quantization levels, and {x_i}_{i=1,2,…,N} the pixel coordinates inside the tracking region; the weighted color-histogram target model p = {p_u(y)}_{u=1,2,…,m} is then

    p_u(y) = C_h \sum_{i=1}^{N} k\left(\left\|\frac{y - x_i}{h}\right\|^2\right) \delta[b(x_i) - u], \qquad C_h = \left[\sum_{i=1}^{N} k\left(\left\|\frac{y - x_i}{h}\right\|^2\right)\right]^{-1},

where h is the kernel bandwidth, k(·) is the profile function corresponding to the kernel, the function b assigns each pixel to a histogram bin according to its color value, δ[·] is the Kronecker delta function, and u is the histogram quantization level.

An initial particle set of N particles is established,

    \{ s_0^{(i)} \}_{i=1,2,…,N},

where s_0^{(i)} denotes the center coordinates and radius of the landmark projection circle in the i-th particle at frame 0; each particle starts with weight 1/N. In subsequent image frames the system state is propagated to spread the particles randomly and the particle weights are computed; the weighted particles give the coordinate position of each target in the image plane; the coordinate positions are output; and the particles are then resampled to redistribute their positions.
4. The pose resolving module calculates the position and attitude of the helicopter relative to the ground coordinate system.
The specific operation is as follows: the transformation between the helicopter coordinate system and the camera coordinate system is known:

    \begin{pmatrix} x_c \\ y_c \\ z_c \\ 1 \end{pmatrix} = \begin{pmatrix} R & t \\ 0^\top & 1 \end{pmatrix} \begin{pmatrix} x_h \\ y_h \\ z_h \\ 1 \end{pmatrix},

where (x_c, y_c, z_c) and (x_h, y_h, z_h) are the homogeneous coordinates of a landmark center in the camera coordinate system and in the helicopter coordinate system, respectively. An iterative least-squares algorithm solves for the rotation matrix R and the translation vector t of the helicopter coordinate system relative to the origin of the camera coordinate system; decomposing R yields the helicopter's yaw, roll and pitch angles, and t yields the helicopter's height and its distance from the landing point.
5. The ground station computer transmits the pose information through the wireless data transmission module to a receiver connected to the flight control computer; the receiver then forwards the pose information to the flight control computer over an RS-232 interface.
6. The flight control computer guides the helicopter to land at the specified position according to the pose information.
By following the above steps, the unmanned helicopter can achieve safe fixed-point landing.

Claims (5)

1. A helicopter landing guide apparatus, comprising: a ground three-camera vision sensor system, a foreground searching module, a tracking matching module, a pose resolving module and a wireless data transmission module, wherein the ground three-camera vision sensor system, the foreground searching module, the tracking matching module, the pose resolving module and the wireless data transmission module are connected in sequence, and the ground three-camera vision sensor system is further connected with the tracking matching module.
2. An unmanned helicopter landing visual guidance method applying the helicopter landing guidance device of claim 1, characterized in that the method comprises the following steps:
(1) the ground three-camera vision sensor system collects images;
(2) searching the image foreground, namely the four airborne artificial landmarks colored red, yellow, blue and green, in the images acquired in step 1;
(3) tracking the artificial landmark in real time by using a particle filtering algorithm;
(4) calculating the position and the attitude of the helicopter relative to a ground coordinate system by a pose resolving module;
(5) the ground station computer transmits the pose information obtained in the step 4 to a receiver connected to the flight control computer through a wireless data transmission module, and then transmits the pose information to the flight control computer through an RS-232 interface;
(6) the flight control computer guides the helicopter to land at the specified position according to the pose information.
3. The visual guidance method for landing of an unmanned helicopter of claim 2, wherein said step 2 is specifically: converting the image output by the ground three-camera vision sensor system from the RGB domain to the HSV domain; screening out the required color areas, namely the red, yellow, blue and green areas corresponding to the four artificial landmarks; inverting these color areas to their complementary colors, namely green, blue, yellow and red; converting both images to grayscale and denoising them, then taking their difference to eliminate the background; applying the Canny operator to find the edges of the spheres; and finally applying the Hough transform to obtain the radius and center coordinates of each of the four circles in the image.
4. The visual guidance method for landing of an unmanned helicopter of claim 2, wherein said step 3 is specifically: setting the number of particles and adopting a first-order adaptive model as the motion model; taking the four landmarks detected in step 2 as the tracking targets and building the target model with a weighted color histogram: for each artificial landmark, letting y be the center coordinate of its projection circle, m the number of histogram quantization levels, and {x_i}_{i=1,2,…,N} the pixel coordinates inside the tracking region, and establishing the weighted color-histogram target model p = {p_u(y)}_{u=1,2,…,m}:

    p_u(y) = C_h \sum_{i=1}^{N} k\left(\left\|\frac{y - x_i}{h}\right\|^2\right) \delta[b(x_i) - u], \qquad C_h = \left[\sum_{i=1}^{N} k\left(\left\|\frac{y - x_i}{h}\right\|^2\right)\right]^{-1},

where h is the kernel bandwidth, k(·) is the profile function corresponding to the kernel, the function b assigns each pixel to a histogram bin according to its color value, δ[·] is the Kronecker delta function, and u is the histogram quantization level; establishing an initial particle set of N particles,

    \{ s_0^{(i)} \}_{i=1,2,…,N},

where s_0^{(i)} denotes the center coordinates and radius of the landmark projection circle in the i-th particle at frame 0, with an initial weight of 1/N for each particle; in subsequent image frames, propagating the system state to spread the particles randomly and computing the particle weights; obtaining the coordinate position of each target in the image plane from the weighted particles; outputting the coordinate position values; and then resampling to redistribute the positions of the particles.
5. The visual guidance method for landing of an unmanned helicopter of claim 2, wherein said step 4 is specifically: the transformation between the helicopter coordinate system and the camera coordinate system is known:

    \begin{pmatrix} x_c \\ y_c \\ z_c \\ 1 \end{pmatrix} = \begin{pmatrix} R & t \\ 0^\top & 1 \end{pmatrix} \begin{pmatrix} x_h \\ y_h \\ z_h \\ 1 \end{pmatrix},

where (x_c, y_c, z_c) and (x_h, y_h, z_h) are the homogeneous coordinates of a landmark center in the camera coordinate system and in the helicopter coordinate system, respectively; solving for the rotation matrix R and the translation vector t of the helicopter coordinate system relative to the origin of the camera coordinate system with an iterative least-squares algorithm; obtaining the helicopter's yaw, roll and pitch angles by decomposing the rotation matrix; and obtaining the helicopter's height and its distance from the landing point from the translation vector.
CN201210000593.2A 2012-01-04 2012-01-04 Helicopter landing guide device and method based on computer vision Expired - Fee Related CN102538782B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210000593.2A CN102538782B (en) 2012-01-04 2012-01-04 Helicopter landing guide device and method based on computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210000593.2A CN102538782B (en) 2012-01-04 2012-01-04 Helicopter landing guide device and method based on computer vision

Publications (2)

Publication Number Publication Date
CN102538782A true CN102538782A (en) 2012-07-04
CN102538782B CN102538782B (en) 2014-08-27

Family

ID=46346273

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210000593.2A Expired - Fee Related CN102538782B (en) 2012-01-04 2012-01-04 Helicopter landing guide device and method based on computer vision

Country Status (1)

Country Link
CN (1) CN102538782B (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226356A (en) * 2013-02-27 2013-07-31 广东工业大学 Image-processing-based unmanned plane accurate position landing method
CN103413466A (en) * 2013-07-08 2013-11-27 中国航空无线电电子研究所 Airborne visible ground guide and warning device and guide and warning method thereof
CN104154910A (en) * 2014-07-22 2014-11-19 清华大学 Indoor micro unmanned aerial vehicle location method
CN104679013A (en) * 2015-03-10 2015-06-03 无锡桑尼安科技有限公司 Unmanned plane automatic landing system
CN105068548A (en) * 2015-08-12 2015-11-18 北京贯中精仪科技有限公司 Landing guide system of unmanned aerial vehicle
CN105182994A (en) * 2015-08-10 2015-12-23 普宙飞行器科技(深圳)有限公司 Unmanned-aerial-vehicle fixed-point landing method
CN105676875A (en) * 2015-03-10 2016-06-15 张超 Automatic landing system of unmanned aerial vehicle
CN105979119A (en) * 2016-06-02 2016-09-28 深圳迪乐普数码科技有限公司 Method for filtering infrared rocking arm tracking motion data and terminal
CN107544550A (en) * 2016-06-24 2018-01-05 西安电子科技大学 A kind of Autonomous Landing of UAV method of view-based access control model guiding
CN109521791A (en) * 2018-09-28 2019-03-26 易瓦特科技股份公司 Identification method and device based on earth station
CN109791413A (en) * 2016-10-10 2019-05-21 高通股份有限公司 For making system and method for the UAV Landing on mobile foundation
CN113154220A (en) * 2021-03-26 2021-07-23 苏州略润娇贸易有限公司 Easily-built computer for construction site and use method thereof
CN114812513A (en) * 2022-05-10 2022-07-29 北京理工大学 Unmanned aerial vehicle positioning system and method based on infrared beacon

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101000243A (en) * 2007-01-16 2007-07-18 北京航空航天大学 Pilotless plane landing navigation method and its device
CN101109640A (en) * 2006-07-19 2008-01-23 北京航空航天大学 Unmanned aircraft landing navigation system based on vision
CN101833104A (en) * 2010-04-27 2010-09-15 北京航空航天大学 Three-dimensional visual navigation method based on multi-sensor information fusion

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101109640A (en) * 2006-07-19 2008-01-23 北京航空航天大学 Unmanned aircraft landing navigation system based on vision
CN101000243A (en) * 2007-01-16 2007-07-18 北京航空航天大学 Pilotless plane landing navigation method and its device
CN101833104A (en) * 2010-04-27 2010-09-15 北京航空航天大学 Three-dimensional visual navigation method based on multi-sensor information fusion

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226356A (en) * 2013-02-27 2013-07-31 广东工业大学 Image-processing-based unmanned plane accurate position landing method
CN103413466A (en) * 2013-07-08 2013-11-27 中国航空无线电电子研究所 Airborne visible ground guide and warning device and guide and warning method thereof
CN104154910A (en) * 2014-07-22 2014-11-19 清华大学 Indoor micro unmanned aerial vehicle location method
CN104679013A (en) * 2015-03-10 2015-06-03 无锡桑尼安科技有限公司 Unmanned plane automatic landing system
CN105068553A (en) * 2015-03-10 2015-11-18 无锡桑尼安科技有限公司 Unmanned aerial vehicle automatic landing system
CN105676875A (en) * 2015-03-10 2016-06-15 张超 Automatic landing system of unmanned aerial vehicle
CN105182994B (en) * 2015-08-10 2018-02-06 普宙飞行器科技(深圳)有限公司 A kind of method of unmanned plane pinpoint landing
CN105182994A (en) * 2015-08-10 2015-12-23 普宙飞行器科技(深圳)有限公司 Unmanned-aerial-vehicle fixed-point landing method
CN105068548A (en) * 2015-08-12 2015-11-18 北京贯中精仪科技有限公司 Landing guide system of unmanned aerial vehicle
CN105068548B (en) * 2015-08-12 2019-06-28 北京贯中精仪科技有限公司 UAV Landing guides system
CN105979119A (en) * 2016-06-02 2016-09-28 深圳迪乐普数码科技有限公司 Method for filtering infrared rocking arm tracking motion data and terminal
CN105979119B (en) * 2016-06-02 2019-07-16 深圳迪乐普数码科技有限公司 A kind of filtering method and terminal of infrared rocker arm pursuit movement data
CN107544550A (en) * 2016-06-24 2018-01-05 西安电子科技大学 A kind of Autonomous Landing of UAV method of view-based access control model guiding
CN107544550B (en) * 2016-06-24 2021-01-15 西安电子科技大学 Unmanned aerial vehicle automatic landing method based on visual guidance
CN109791413A (en) * 2016-10-10 2019-05-21 高通股份有限公司 For making system and method for the UAV Landing on mobile foundation
CN109521791A (en) * 2018-09-28 2019-03-26 易瓦特科技股份公司 Identification method and device based on earth station
CN113154220A (en) * 2021-03-26 2021-07-23 苏州略润娇贸易有限公司 Easily-built computer for construction site and use method thereof
CN114812513A (en) * 2022-05-10 2022-07-29 北京理工大学 Unmanned aerial vehicle positioning system and method based on infrared beacon

Also Published As

Publication number Publication date
CN102538782B (en) 2014-08-27

Similar Documents

Publication Publication Date Title
CN102538782B (en) Helicopter landing guide device and method based on computer vision
CN108711166B (en) Monocular camera scale estimation method based on quad-rotor unmanned aerial vehicle
US10650235B2 (en) Systems and methods for detecting and tracking movable objects
CN106529495B (en) Obstacle detection method and device for aircraft
US20190220650A1 (en) Systems and methods for depth map sampling
US10703479B2 (en) Unmanned aerial vehicle, control systems for unmanned aerial vehicle and control method thereof
CN112567201A (en) Distance measuring method and apparatus
US10895458B2 (en) Method, apparatus, and system for determining a movement of a mobile platform
CN106529538A (en) Method and device for positioning aircraft
CN108235815B (en) Imaging control device, imaging system, moving object, imaging control method, and medium
CN104215239A (en) Vision-based autonomous unmanned plane landing guidance device and method
CN109341686B (en) Aircraft landing pose estimation method based on visual-inertial tight coupling
WO2018027339A1 (en) Copyright notice
CN111829532B (en) Aircraft repositioning system and method
WO2022021027A1 (en) Target tracking method and apparatus, unmanned aerial vehicle, system, and readable storage medium
Eynard et al. Real time UAV altitude, attitude and motion estimation from hybrid stereovision
US11842440B2 (en) Landmark location reconstruction in autonomous machine applications
CN111736190B (en) Unmanned aerial vehicle airborne target detection system and method
CN114581516A (en) Monocular vision-based multi-unmanned aerial vehicle intelligent identification and relative positioning method
CN207068060U (en) The scene of a traffic accident three-dimensional reconstruction system taken photo by plane based on unmanned plane aircraft
CN112950671A (en) Real-time high-precision parameter measurement method for moving target by unmanned aerial vehicle
CN105807083B (en) A kind of unmanned vehicle real time speed measuring method and system
Zhang et al. An UAV navigation aided with computer vision
WO2022021028A1 (en) Target detection method, device, unmanned aerial vehicle, and computer-readable storage medium
Eynard et al. UAV Motion Estimation using Hybrid Stereoscopic Vision.

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140827

Termination date: 20150104

EXPY Termination of patent right or utility model