KR101733677B1 - Apparatus and method for unmanned plane precision landing using artificial landmark and ultrasonic sensor - Google Patents

Apparatus and method for unmanned plane precision landing using artificial landmark and ultrasonic sensor

Info

Publication number
KR101733677B1
Authority
KR
South Korea
Prior art keywords
landing
artificial landmark
image
take
artificial
Prior art date
Application number
KR1020150082648A
Other languages
Korean (ko)
Other versions
KR20160146062A (en)
Inventor
정진호
Original Assignee
주식회사 두시텍
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 두시텍
Priority to KR1020150082648A
Publication of KR20160146062A
Application granted
Publication of KR101733677B1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00 Aircraft indicators or protectors not otherwise provided for
    • B64D45/04 Landing aids; Safety measures to prevent collision with earth's surface
    • B64D45/08 Landing aids; Safety measures to prevent collision with earth's surface optical
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • B64C39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • B64C2201/18

Abstract

The present invention relates to an apparatus and method for the precise landing of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor, and more particularly to an apparatus and method that enable the unmanned aerial vehicle to land precisely on the basis of artificial landmark recognition in acquired images and of distance information obtained through an ultrasonic sensor using ultrasonic distance detection technology.

Description

Technical Field [0001] The present invention relates to an apparatus and a method for the precise landing of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor.

The present invention relates to an apparatus and method for the precise landing of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor, and more particularly to an apparatus and method that enable precise landing on the basis of artificial landmark recognition technology and ultrasonic distance detection technology.

It is most important to know the location of the unmanned airplane and the landing point in order for the unmanned airplane to land automatically.

There are two methods of locating the moving object: using a position sensor such as GPS, and recognizing the surrounding environment using a camera image. However, they have disadvantages.

A general GPS sensor has low positional accuracy (a position error of about 10 m), and high-accuracy GPS sensors are very expensive. Even expensive GPS products lose accuracy when many buildings are nearby (in GPS shadow areas).

A position recognition method using camera images, on the other hand, locates the vehicle by matching previously stored (i.e., known) image information against the input image. General image recognition, however, suffers from a low recognition success rate and long processing times because images change greatly with weather, time of day, and illumination.

Therefore, in practical applications, an artificial landmark manufactured to be easy to identify is attached to a floor, ceiling, or wall and then recognized.

In addition, general image recognition has a problem in that it is difficult to identify an image at a close range (within 50 cm).

Prior art related to position recognition using artificial landmarks is disclosed in Korean Patent Laid-Open No. 10-2007-0066192 (Method and apparatus for recognizing the position of an indoor mobile robot) and Korean Patent Laid-Open No. 10-2011-0066714 (Apparatus and method for recognizing the position of a mobile robot).

Korean Unexamined Patent Publication No. 10-2007-0066192 proposes a marking in which circular marks 1 and 2 are arranged inside a square patch 5 as shown in FIG.

Korean Patent Laid-Open No. 10-2011-0066714 proposes a method of identifying IDs between markers using a grid pattern in a rectangular patch 7 as shown in FIG.

Korean Patent Publication No. 10-2007-0066192 (Registered on Jun. 27, 2007)
Korean Patent Publication No. 10-2011-0066714 (Published on Jun. 17, 2011)

SUMMARY OF THE INVENTION Accordingly, the present invention has been made keeping in mind the above problems occurring in the prior art, and an object of the present invention is to provide an apparatus and method for the precise landing of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor, capable of precise landing by means of artificial landmark recognition technology and ultrasonic distance detection technology.

The objects of the embodiments of the present invention are not limited to the above-mentioned objects; other objects not mentioned will be clearly understood by those skilled in the art from the following description.

According to an embodiment of the present invention, an apparatus for the precise landing of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor comprises: an image acquisition unit 100 for acquiring an external image; an ultrasonic sensor 200; and a precise landing control unit 300 that moves the aerial vehicle to a precision altitude calculation height on the basis of information obtained by detecting an artificial landmark in the image acquired by the image acquisition unit 100, moves the aerial vehicle from the precision altitude calculation height to a landing waiting height on the basis of the detected artificial landmark information and the distance information measured by the ultrasonic sensor 200, and lands the unmanned aerial vehicle from the landing waiting height to touchdown on the basis of the distance information measured by the ultrasonic sensor 200.

In addition, the precise landing control unit 300 includes: a take-off information storage unit 310 for receiving and storing, from the outside, artificial landmark specification information including center coordinate information, flight mission information, and GPS information; an artificial landmark detection unit 320 for detecting an artificial landmark in the image acquired by the image acquisition unit 100; an information management unit 330 for determining the landing direction on the basis of the artificial landmark image detected by the artificial landmark detection unit 320; an approach information calculation unit 340 for calculating approach information for moving the aerial vehicle to the position of the artificial landmark on the basis of the artificial landmark information stored in the take-off information storage unit 310 and the artificial landmark image detected by the artificial landmark detection unit 320; a center coordinate calculation unit 350 for calculating the center coordinates and error information of the aerial vehicle on the basis of the center coordinate information of the artificial landmark information stored in the take-off information storage unit 310 and the artificial landmark image detected by the artificial landmark detection unit 320; and a precision altitude calculation unit 360 for calculating the altitude of the aerial vehicle on the basis of information obtained from the ultrasonic sensor 200.

In addition, the information management unit 330 determines magnetic north in the GPS position coordinates on the basis of the artificial landmark information and the artificial landmark image, and determines the landing direction on the basis of the artificial landmark angle determined through the magnetic north determination with respect to the center coordinate recognition reference.

A precise landing method of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor according to an embodiment of the present invention comprises: a take-off preparation step (S10) in which the precise landing control unit receives, from the outside, artificial landmark information including center coordinate information and flight mission information, and acquires the GPS information of the take-off position; an automatic take-off step (S20) in which the take-off position image is acquired using the image acquisition unit 100 and the precise landing control unit determines the landing direction on the basis of the take-off position image and the artificial landmark information of the take-off preparation step (S10); a mission performing step (S30) in which the unmanned aerial vehicle performs the flight mission on the basis of the flight mission information of the take-off preparation step (S10); an artificial landmark tracking step (S40) in which, after the unmanned aerial vehicle performs the flight mission, the precise landing control unit moves the vehicle to the corresponding GPS coordinates on the basis of the GPS information of the take-off preparation step (S10), acquires a landing point image using the image acquisition unit 100, and detects the artificial landmark image in the acquired landing point image; an artificial landmark-based descent step (S50) in which the precise landing control unit synchronizes the position of the aerial vehicle down to a predetermined precision altitude calculation height on the basis of the artificial landmark image detected in the artificial landmark tracking step (S40) and synchronizes the heading with the landing direction of the automatic take-off step (S20); a dual-based descent step (S60) in which the precise landing control unit calculates the precision altitude using the ultrasonic sensor 200 from the precision altitude calculation height down to a predetermined landing waiting height and synchronizes the position and landing direction of the aerial vehicle on the basis of the artificial landmark image and the precision altitude; and an ultrasonic sensor-based landing step (S70) in which the precise landing control unit vertically descends the unmanned aerial vehicle from the predetermined landing waiting height on the basis of the precision altitude value obtained using the ultrasonic sensor 200.

In addition, the automatic take-off step (S20) includes: a take-off image acquisition step (S21) of acquiring an image of the take-off position using the image acquisition unit 100; a take-off artificial landmark detection step (S22) in which the precise landing control unit detects the artificial landmark image in the image acquired in the take-off image acquisition step (S21); and a landing setting step (S23) in which the precise landing control unit determines magnetic north in the GPS position coordinates on the basis of the artificial landmark specification information of the take-off preparation step (S10) and the artificial landmark image acquired in the take-off artificial landmark detection step (S22), and determines the landing direction on the basis of the artificial landmark angle determined through the magnetic north determination with respect to the center coordinate recognition reference.

In addition, the artificial landmark-based descent step (S50) includes: a position synchronization step (S51) in which the precise landing control unit calculates the precise position of the artificial landmark center coordinates on the basis of the artificial landmark image detected in the artificial landmark tracking step (S40) and synchronizes the position of the aerial vehicle accordingly; and an angle synchronization step (S52) in which the precise landing control unit synchronizes the heading of the aerial vehicle with the landing direction of the automatic take-off step (S20) until the predetermined precision altitude calculation height is reached.

According to the apparatus and method for the precise landing of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor according to an embodiment of the present invention, errors in GPS information can be corrected using artificial landmark recognition technology, the impact when the aerial vehicle touches the ground can be reduced, and the unmanned aerial vehicle can be landed automatically, accurately, and safely.

In addition, when the takeoff direction of the unmanned airplane is matched with the landing direction, it is possible to automatically land the unmanned airplane more safely.

FIG. 1 is a block diagram of a precise landing apparatus of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor according to an embodiment of the present invention.
FIG. 2 is a detailed view of FIG. 1.
FIGS. 3 to 5 are flowcharts for explaining a method for the precise landing of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor according to an embodiment of the present invention.
FIG. 6 is a conceptual diagram for explaining the technique applied to the take-off and landing of the unmanned aerial vehicle.

Hereinafter, the present invention will be described in more detail with reference to the accompanying drawings. Prior to this, terms and words used in the present specification and claims should not be construed as limited to their ordinary or dictionary meanings; since an inventor may define terms appropriately in order to describe the invention in the best way, they should be construed with meanings and concepts consistent with the technical idea of the present invention. Unless otherwise defined, technical and scientific terms used herein have the same meanings as commonly understood by one of ordinary skill in the art to which this invention belongs. Descriptions of known functions and configurations that might unnecessarily obscure the gist of the invention are omitted. The following drawings are provided by way of example so that those skilled in the art can fully understand the spirit of the present invention; therefore, the present invention is not limited to the drawings and may be embodied in other forms. Like reference numerals designate like elements throughout the specification, and the same elements are denoted by the same reference numerals whenever possible.

Prior to the description, the terms used in this specification (and claims) will be briefly described.

'Artificial marker recognition technology' refers to a technique for calculating position and angle based on artificial markers extracted from images.

'Ultrasonic distance detection technology' refers to the technique of measuring the distance to the target point using the ultrasound arrival time.

FIG. 1 is a block diagram of a precise landing apparatus of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor according to an embodiment of the present invention, FIG. 2 is a detailed view of FIG. 1, FIGS. 3 to 5 are flowcharts for explaining a precise landing method of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor according to an embodiment of the present invention, and FIG. 6 is a conceptual diagram for explaining the technique applied to the take-off and landing of the unmanned aerial vehicle.

As shown in FIG. 1, an unmanned aerial vehicle precision landing apparatus using an artificial landmark and an ultrasonic sensor according to an embodiment of the present invention includes an image acquisition unit 100, an ultrasonic sensor 200, and a precise landing control unit 300.

The image acquisition unit 100 acquires an external image using a camera or the like.

The ultrasonic sensor 200 generates ultrasonic waves and receives reflected ultrasonic waves.

The precise landing control unit 300 moves the aerial vehicle to the precision altitude calculation height on the basis of information obtained by detecting the artificial landmark in the image acquired by the image acquisition unit 100, moves the aerial vehicle from the precision altitude calculation height to the landing waiting height on the basis of the detected artificial landmark information and the distance information measured by the ultrasonic sensor 200, and lands the unmanned aerial vehicle from the landing waiting height to touchdown on the basis of the distance information measured by the ultrasonic sensor 200.

Here, the precision altitude calculation height refers to the height at which the ultrasonic sensor 200 starts to be used, that is, a height set so that the ultrasonic sensor 200 can compensate for the limited height information obtainable from the images of the image acquisition unit 100. For example, the precision altitude calculation height may be set within 3 m.

The landing waiting height, which is bounded by the height at which the image acquired by the image acquisition unit 100 is still recognizable, refers to a height set so that landing can proceed using only the distance measured by the ultrasonic sensor 200, without the image information from the image acquisition unit 100. For example, the landing waiting height may be set within 1 m (in general, images acquired within 50 cm are out of focus and difficult to use).

That is, artificial landmark recognition technology is used until the unmanned aerial vehicle reaches the precision altitude calculation height; both artificial landmark recognition technology and ultrasonic distance detection technology are used from the precision altitude calculation height down to the landing waiting height; and only ultrasonic distance detection technology is used from the landing waiting height until the unmanned aerial vehicle lands.
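
To make the hand-off between these phases concrete, the following is a minimal sketch (Python, not part of the patent) of how a controller might pick the active sensing mode from the current height estimate; the 3 m and 1 m thresholds are only the example values mentioned above.

```python
# Minimal sketch of the three-phase guidance selection described above.
# The threshold values follow the examples in the description (3 m and 1 m);
# they are illustrative, not mandated by the patent.

PRECISION_ALT_CALC_HEIGHT_M = 3.0  # example precision altitude calculation height
LANDING_WAITING_HEIGHT_M = 1.0     # example landing waiting height

def select_guidance_mode(height_m: float) -> str:
    """Return the sensing mode that drives the descent at the given height."""
    if height_m > PRECISION_ALT_CALC_HEIGHT_M:
        return "landmark_vision"                  # S50: artificial landmark-based descent
    if height_m > LANDING_WAITING_HEIGHT_M:
        return "landmark_vision_plus_ultrasonic"  # S60: dual-based descent
    return "ultrasonic_only"                      # S70: ultrasonic sensor-based landing
```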

As shown in FIG. 2, the precise landing control unit 300 includes a take-off information storage unit 310, an artificial landmark detection unit 320, an information management unit 330, an approach information calculation unit 340, a center coordinate calculation unit 350, and a precision altitude calculation unit 360.

The take-off information storage unit 310 receives and stores, from the outside, artificial landmark information including center coordinate information, flight mission information, and GPS information.

An artificial landmark is a mark that follows a specific rule and is required to confirm the landing point and landing direction of the unmanned aerial vehicle. That is, when the artificial landmark is detected in an image, magnetic north can be determined in the GPS position coordinates, and the artificial landmark angle can be calculated through the magnetic north determination with respect to the center coordinate recognition reference.

That is, after the center coordinates are determined, the landing direction can be determined based on the GPS coordinates and magnetic north information.

The artificial landmark detection unit 320 detects an artificial landmark in the image acquired from the image acquisition unit 100.

At this time, the artificial landmark detection unit 320 detects the artificial landmark in the acquired image on the basis of the artificial landmark specification information.

The information management unit 330 determines the landing direction based on the artificial landmark image detected by the artificial landmark detection unit 320.

The landing direction can be determined from the heading of the unmanned aerial vehicle at take-off; that is, it can be fixed on the basis of the artificial landmark detected in the image acquired at a certain height after the vehicle takes off.

At this time, the information management unit 330 determines magnetic north in the GPS position coordinates on the basis of the artificial landmark information and the artificial landmark image, and determines the landing direction on the basis of the artificial landmark angle determined through the magnetic north determination with respect to the center coordinate recognition reference.

In other words, if the artificial landmark is detected in an image acquired at a certain height after take-off, the direction of magnetic north can be determined by comparison with the artificial landmark information; on this basis, the current heading of the vehicle can be recorded and set as the landing direction.

The approach information calculation unit 340 calculates approach information for moving the aerial vehicle to the position of the artificial landmark, on the basis of the artificial landmark information stored in the take-off information storage unit 310 and the artificial landmark image detected by the artificial landmark detection unit 320.

The approach information calculation unit 340 calculates the approach information used to move the unmanned aerial vehicle to be landed toward the coordinates of the artificial landmark. The approach information can be used to calculate rotation-angle processing information for extracting the angle between the unmanned aerial vehicle and the artificial landmark when approaching the landmark from an altitude of 10 m or more. The direction and angle from a reference point to a specific point in the image can be obtained using a trigonometric function (atan). Since the image is processed in units of pixels, the computation is carried out in pixels: the reference point can be defined as the center of the bottom of the image, the center coordinates of the target object in the image are found, and the difference from the reference pixel is taken.
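
As an illustration of the pixel-based atan computation just described, the sketch below (hypothetical code, not from the patent) takes the detected landmark centre in pixel coordinates and the bottom-centre reference pixel and returns a bearing angle; the axis conventions are assumptions.

```python
import math

def bearing_to_landmark_deg(landmark_px: float, landmark_py: float,
                            image_width: int, image_height: int) -> float:
    """Angle, in degrees, from the bottom-centre reference pixel to the
    landmark centre. Hypothetical helper illustrating the atan-based pixel
    geometry described above; x is assumed to grow rightward and y downward,
    as in common image coordinates."""
    ref_x = image_width / 2.0    # reference point: centre of the image bottom edge
    ref_y = float(image_height)  # bottom row of the image
    dx = landmark_px - ref_x     # horizontal pixel offset from the reference
    dy = ref_y - landmark_py     # vertical pixel offset (positive = above the reference)
    return math.degrees(math.atan2(dx, dy))
```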

The center coordinate calculation unit 350 calculates the center coordinates and error information of the aerial vehicle on the basis of the center coordinate information of the artificial landmark information stored in the take-off information storage unit 310 and the artificial landmark image detected by the artificial landmark detection unit 320.

The error of the vehicle's position with respect to the center coordinates of the artificial landmark can be calculated, so it is possible to check how far, and in which direction, the unmanned aerial vehicle is located from the landmark center.

That is, from the center coordinates and the position information of the unmanned aerial vehicle, control information that moves the vehicle toward the center coordinates can be output, guiding the vehicle to the center coordinate point.
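
A minimal sketch of this error-to-guidance idea follows (hypothetical code; the ground-sampling scale factor, gains, and limits are assumptions, not something the patent specifies).

```python
def centering_command(landmark_center_px, image_center_px, metres_per_pixel,
                      gain=0.5, max_speed_m_s=1.0):
    """Turn the pixel offset between the landmark centre and the image centre
    into a simple proportional horizontal velocity command (metres per second).
    Hypothetical illustration only."""
    ex = (landmark_center_px[0] - image_center_px[0]) * metres_per_pixel  # horizontal error, axis 1
    ey = (landmark_center_px[1] - image_center_px[1]) * metres_per_pixel  # horizontal error, axis 2
    vx = max(-max_speed_m_s, min(max_speed_m_s, gain * ex))
    vy = max(-max_speed_m_s, min(max_speed_m_s, gain * ey))
    return vx, vy
```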

The precision altitude calculation unit 360 calculates the altitude of the airplane based on the information obtained from the ultrasonic sensor 200.

That is, when the ultrasonic sensor emits ultrasonic waves perpendicular to the ground and receives the reflected waves, the height above the ground can be calculated from the round-trip time.
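
A one-line time-of-flight conversion captures this; the sketch below is generic (speed of sound at roughly 20 °C) and not tied to any particular sensor named in the patent.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at about 20 °C

def height_from_echo(round_trip_time_s: float) -> float:
    """Height above ground from an ultrasonic echo: the pulse travels down and
    back, so the one-way distance is half the round-trip distance."""
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0
```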

As shown in FIGS. 3 and 6, the method for the precise landing of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor according to an embodiment of the present invention includes a take-off preparation step (S10), an automatic take-off step (S20), a mission performing step (S30), an artificial landmark tracking step (S40), an artificial landmark-based descent step (S50), a dual-based descent step (S60), and an ultrasonic sensor-based landing step (S70).
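
For orientation, the sketch below strings the seven steps together as a single driver routine; `ctrl` and its methods are assumed interfaces invented for illustration, not an API defined by the patent.

```python
def precision_landing_mission(ctrl):
    """Hypothetical end-to-end driver for the steps S10-S70 described below."""
    ctrl.prepare_takeoff()         # S10: load landmark/mission info, read take-off GPS
    ctrl.automatic_takeoff()       # S20: climb, detect landmark, fix the landing direction
    ctrl.perform_mission()         # S30: fly the stored flight mission
    ctrl.track_landmark()          # S40: return to the GPS point, find the landmark in the image
    ctrl.landmark_based_descent()  # S50: descend on landmark vision to the precision altitude calculation height
    ctrl.dual_based_descent()      # S60: landmark vision plus ultrasonic ranging down to the landing waiting height
    ctrl.ultrasonic_landing()      # S70: ultrasonic-only vertical touchdown
```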

In the take-off preparation step S10, the precision landing control unit receives the artificial landmark information including the center coordinate information and the flight mission information from the outside, and acquires the GPS information of the take-off position.

That is, the artificial landmark information is input before the flight of the unmanned aerial vehicle, and the GPS information of the take-off position is received immediately before take-off.

In the automatic take-off step (S20), after the unmanned aerial vehicle has risen to a predetermined height at take-off, the take-off position image is acquired using the image acquisition unit 100, and the precise landing control unit determines the landing direction on the basis of the take-off position image and the artificial landmark information of the take-off preparation step (S10).

As shown in FIG. 4, the automatic take-off step S20 may include a take-off image acquiring step S21, a take-off artificial landmark detection step S22, and a landing setting step S23.

In the take-off image acquisition step (S21), the image of the take-off position is acquired using the image acquisition unit 100.

The take-off image acquisition step (S21) acquires the image after the unmanned aerial vehicle has ascended to a predetermined height. It is preferable to acquire the image at a height from which the artificial landmark can easily be extracted, for example within 5 m.

In the take-off artificial landmark detection step (S22), the precise landing control unit detects the artificial landmark image in the image acquired in the take-off image acquisition step (S21).

In the landing setting step (S23), the precise landing control unit determines magnetic north in the GPS position coordinates on the basis of the artificial landmark specification information of the take-off preparation step (S10) and the artificial landmark image acquired in the take-off artificial landmark detection step (S22), and determines the landing direction on the basis of the artificial landmark angle determined through the magnetic north determination with respect to the center coordinate recognition reference.

When the artificial landmark is detected in the image acquired in the take-off artificial landmark detection step (S22), the direction of magnetic north can be confirmed by comparison with the artificial landmark information; on this basis, the current heading of the vehicle can be recorded and set as the landing direction.

That is, in order to safely land the UAV, information necessary for landing (such as landing direction) must be acquired before the flight mission is performed.

In the mission execution step S30, the unmanned airplane performs the flight mission based on the flight mission information in the take-off preparation step S10.

In the artificial landmark tracking step (S40), after the unmanned aerial vehicle performs the flight mission, the precise landing control unit moves the vehicle to the corresponding GPS coordinates on the basis of the GPS information of the take-off preparation step (S10), acquires the landing point image using the image acquisition unit 100, and detects the artificial landmark image in the acquired landing point image.

That is, the unmanned aerial vehicle is moved to the GPS coordinates of the landing point and the surrounding area is photographed until an image in which the artificial landmark can be detected is acquired. Since GPS information is error-prone, the artificial landmark must be detected in order to locate the correct landing point and land on it.
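
A simple way to picture this search is the loop below (hypothetical code; the callables are assumed interfaces standing in for the flight controller, camera, and detector).

```python
def acquire_landmark(fly_to_gps, capture_image, detect_landmark,
                     landing_gps, max_attempts=100):
    """Fly to the stored landing GPS coordinates, then keep photographing the
    surroundings until the artificial landmark is detected (or give up)."""
    fly_to_gps(landing_gps)
    for _ in range(max_attempts):
        image = capture_image()
        landmark = detect_landmark(image)  # returns None when no landmark is found
        if landmark is not None:
            return landmark
    return None  # landmark not found; GPS error too large or landmark obscured
```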

In the artificial landmark-based descent step (S50), the precise landing control unit synchronizes the position of the aerial vehicle down to the precision altitude calculation height on the basis of the artificial landmark image detected in the artificial landmark tracking step (S40), and synchronizes the heading with the landing direction of the automatic take-off step (S20) on the basis of the same detected image.

Detection of the artificial landmark means that the landing point has been identified and that the unmanned aerial vehicle should now be moved to it.

As shown in FIG. 5, the artificial landmark-based descent step S50 may include a position synchronization step S51 and an angle synchronization step S52.

In the position synchronization step S51, the precision landing control unit calculates the precise position of the artificial landmark center coordinate based on the artificial landmark image detected in the artificial landmark tracking step S40, and synchronizes the position of the airplane based on this.

That is, the approach information for moving the unmanned aerial vehicle toward the coordinates of the artificial landmark is calculated. As described above for the approach information calculation unit 340, this uses the rotation-angle information for the angle between the unmanned aerial vehicle and the artificial landmark when approaching from an altitude of 10 m or more, obtained in pixel units with the trigonometric function atan from the offset between the bottom-centre reference pixel and the detected landmark centre.

In the angle synchronization step (S52), the precise landing control unit synchronizes (aligns) the heading of the aerial vehicle with the landing direction of the automatic take-off step (S20), while continuing to synchronize its position, until the predetermined precision altitude calculation height is reached.

In the case of a square artificial landmark, for example, the angle by which the vehicle deviates from the landing direction can be obtained from the rotation angles of the four corners. Keeping the landing trajectory similar to the take-off trajectory in this way allows the unmanned aerial vehicle to land safely.
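
As a rough illustration of extracting that angle from a square landmark, the sketch below is hypothetical code; the corner ordering and sign convention are assumptions, not details fixed by the patent.

```python
import math

def yaw_offset_deg(corners_px, landing_direction_deg: float) -> float:
    """Estimate the marker's in-image rotation from one detected edge of a
    square landmark and return its offset from the stored landing direction,
    wrapped to [-180, 180). corners_px is assumed to hold the four corner
    pixels in a consistent (e.g. clockwise) order."""
    (x0, y0), (x1, y1) = corners_px[0], corners_px[1]
    marker_angle_deg = math.degrees(math.atan2(y1 - y0, x1 - x0))  # orientation of one edge
    return (marker_angle_deg - landing_direction_deg + 180.0) % 360.0 - 180.0
```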

In the dual-based descent step (S60), the precise landing control unit calculates the precision altitude using the ultrasonic sensor 200 from the precision altitude calculation height down to the predetermined landing waiting height, and synchronizes the position and landing direction of the aerial vehicle on the basis of the artificial landmark image and the precision altitude.

At this time, the error of the vehicle's position with respect to the center coordinates of the artificial landmark can be calculated from the artificial landmark image and the precision altitude, so that how far, and in which direction, the unmanned aerial vehicle is located can be checked.

That is, by outputting control information that moves the unmanned aerial vehicle toward the center coordinates according to the center coordinates and the vehicle's position information, the vehicle can be guided to the central coordinate point (the position vertically above the landing point).

In the landing step (S70) based on the ultrasonic sensor, the precision landing control unit vertically descends the unmanned airplane based on the precision altitude value using the ultrasonic sensor (200) from the predetermined landing waiting height.

In general, images acquired within 50 cm of the ground are out of focus and difficult to use. At this stage, however, the coordinates of the unmanned aerial vehicle no longer fluctuate significantly, so the descent speed of the vehicle can be adjusted on the basis of the precision altitude value from the ultrasonic sensor 200 so that it touches down on the ground safely.
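
One plausible way to realize this speed adjustment is a simple proportional schedule on the ultrasonically measured height, as sketched below; the speed limits are illustrative assumptions, not values from the patent.

```python
def descent_speed_m_s(ultrasonic_height_m: float,
                      v_max: float = 0.5, v_touchdown: float = 0.1) -> float:
    """Descend faster when high, slow to a gentle constant speed near the
    ground, and stop once the measured height reaches zero."""
    if ultrasonic_height_m <= 0.0:
        return 0.0
    return max(v_touchdown, min(v_max, v_max * ultrasonic_height_m))
```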

It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

100: Image acquisition unit
200: Ultrasonic sensor
300: Precise landing control unit
310: take-off information storage unit 320: artificial landmark detection unit
330: Information management unit 340: Approach information calculation unit
350: center coordinate calculation unit 360: precision altitude calculation unit
S10: Take-off preparation step
S20: Automatic take-off step
S21: Take-off image acquisition step S22: Take-off artificial landmark detection step
S23: Landing setting step
S30: Mission performing step
S40: Artificial landmark tracking step
S50: Artificial landmark-based descent step
S51: Position synchronization step S52: Angle synchronization step
S60: Dual-based descent step
S70: Ultrasonic sensor-based landing step

Claims (6)

1. (Deleted)
2. (Deleted)
3. (Deleted)

4. A take-off preparation step (S10) in which the precise landing control unit receives the artificial landmark specification information including the center coordinate information and the flight mission information from the outside and acquires the GPS information of the take-off position;
An automatic take-off step (S20) in which the take-off position image is acquired using the image acquisition unit 100 and the precise landing control unit determines the landing direction on the basis of the take-off position image and the artificial landmark information of the take-off preparation step (S10);
A mission performing step (S30) in which the unmanned airplane performs a flight mission based on the flight mission information in the take-off preparation step (S10);
An artificial landmark tracking step (S40) in which, after the unmanned aerial vehicle performs the flight mission, the precise landing control unit moves the unmanned aerial vehicle to the corresponding GPS coordinates on the basis of the GPS information of the take-off preparation step (S10), acquires the landing point image using the image acquisition unit 100, and detects the artificial landmark image in the acquired landing point image;
An artificial landmark-based descent step (S50) in which the precise landing control unit synchronizes the position of the aerial vehicle down to a predetermined precision altitude calculation height on the basis of the artificial landmark image detected in the artificial landmark tracking step (S40) and synchronizes the heading with the landing direction of the automatic take-off step (S20);
A dual-based descent step (S60) in which the precise landing control unit calculates the precision altitude using the ultrasonic sensor 200 from the precision altitude calculation height down to a predetermined landing waiting height and synchronizes the position and landing direction of the aerial vehicle on the basis of the artificial landmark image and the precision altitude; And
An ultrasound sensor-based landing step (S70) in which the precision landing control unit vertically descends the unmanned airplane based on the precision altitude value using the ultrasonic sensor (200) from a predetermined landing waiting height;
Wherein the automatic take-off step (S20) comprises:
A take-off image acquiring step (S21) of acquiring an image of a take-off position using the image acquiring unit (100);
A take-off artificial landmark detection step (S22) in which the precise landing control unit detects an artificial landmark image among the images acquired in the take-in image acquisition step (S21); And
A landing setting step (S23) in which the precise landing control unit determines magnetic north in the GPS position coordinates on the basis of the artificial landmark specification information of the take-off preparation step (S10) and the artificial landmark image obtained in the take-off artificial landmark detection step (S22), and determines the landing direction on the basis of the artificial landmark angle determined through the magnetic north determination;
And wherein the artificial landmark-based descent step (S50) comprises:
A position synchronization step (S51) in which the precise landing control unit calculates the precise position of the artificial landmark center coordinates on the basis of the artificial landmark image detected in the artificial landmark tracking step (S40) and synchronizes the position of the aerial vehicle on the basis of the precise position; And
An angle synchronization step (S52) in which the precise landing control unit synchronizes the heading of the aerial vehicle with the landing direction of the automatic take-off step (S20) until a predetermined precision altitude calculation height is reached;
And wherein, in the dual-based descent step (S60), the error of the aerial vehicle's position with respect to the center coordinates of the artificial landmark is calculated on the basis of the artificial landmark image and the precision altitude, so that how far and in which direction the unmanned aerial vehicle is located can be checked, and control information that moves the unmanned aerial vehicle toward the center coordinates is output to guide the vehicle to the central coordinate point,
And wherein, in the ultrasonic sensor-based landing step (S70), the speed of the unmanned aerial vehicle is adjusted on the basis of the precision altitude value obtained using the ultrasonic sensor 200 so that it lands safely without a large impact on the ground;
A method for the precise landing of an unmanned aerial vehicle using an artificial landmark and an ultrasonic sensor.
5. (Deleted)
6. (Deleted)
KR1020150082648A 2015-06-11 2015-06-11 Apparatus and method for unmanned plane precision landing using artificial landmark and ultrasonic sensor KR101733677B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150082648A KR101733677B1 (en) 2015-06-11 2015-06-11 Apparatus and method for unmanned plane precision landing using artificial landmark and ultrasonic sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150082648A KR101733677B1 (en) 2015-06-11 2015-06-11 Apparatus and method for unmanned plane precision landing using artificial landmark and ultrasonic sensor

Publications (2)

Publication Number Publication Date
KR20160146062A KR20160146062A (en) 2016-12-21
KR101733677B1 (en) 2017-05-08

Family

ID=57735083

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150082648A KR101733677B1 (en) 2015-06-11 2015-06-11 Apparatus and method for unmanned plane precision landing using artificial landmark and ultrasonic sensor

Country Status (1)

Country Link
KR (1) KR101733677B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102357299B1 (en) 2020-11-02 2022-02-07 주식회사 두시텍 Operating Methods of Unmanned Aircraft Using a Precision Landing System of an Unmanned Aircraft capable of Reproductive Correction of RTK
KR102357302B1 (en) 2020-11-02 2022-02-07 주식회사 두시텍 Mobile UAV Precision Landing Control Unit with Relative Calibration of RTK

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102100606B1 (en) * 2018-09-11 2020-04-16 한국전력공사 System for landing a drone and operating method thereof
KR102160734B1 (en) * 2018-09-19 2020-09-28 김현철 Image taking system using wireless rechargeable drones
CN112882489A (en) * 2021-01-12 2021-06-01 深圳市慧明捷科技有限公司 Unmanned aerial vehicle data acquisition system based on big data
KR102448233B1 (en) * 2021-12-13 2022-10-04 주식회사 유시스 Drone controlling method for precise landing

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101157484B1 (en) * 2010-12-14 2012-06-20 주식회사 대한항공 Uav automatic recovering method
JP2012232654A (en) * 2011-04-28 2012-11-29 Topcon Corp Taking-off and landing target device, and automatic taking-off and landing system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101146119B1 (en) 2005-12-21 2012-05-16 재단법인 포항산업과학연구원 Method and apparatus for determining positions of robot
KR101105737B1 (en) 2009-12-11 2012-01-17 충북대학교 산학협력단 Apparatus and method for recognizing position of mobile robot

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101157484B1 (en) * 2010-12-14 2012-06-20 주식회사 대한항공 Uav automatic recovering method
JP2012232654A (en) * 2011-04-28 2012-11-29 Topcon Corp Taking-off and landing target device, and automatic taking-off and landing system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102357299B1 (en) 2020-11-02 2022-02-07 주식회사 두시텍 Operating Methods of Unmanned Aircraft Using a Precision Landing System of an Unmanned Aircraft capable of Reproductive Correction of RTK
KR102357302B1 (en) 2020-11-02 2022-02-07 주식회사 두시텍 Mobile UAV Precision Landing Control Unit with Relative Calibration of RTK

Also Published As

Publication number Publication date
KR20160146062A (en) 2016-12-21

Similar Documents

Publication Publication Date Title
KR101733677B1 (en) Apparatus and method for unmanned plane precision landing using artificial landmark and ultrasonic sensor
CN105335733B (en) Unmanned aerial vehicle autonomous landing visual positioning method and system
CN109823552B (en) Vision-based unmanned aerial vehicle accurate landing method, storage medium, device and system
US20160253808A1 (en) Determination of object data by template-based uav control
CN106054929B (en) A kind of unmanned plane based on light stream lands bootstrap technique automatically
US9728094B2 (en) Redundant determination of positional data for an automatic landing system
CN106408601B (en) A kind of binocular fusion localization method and device based on GPS
WO2019182521A1 (en) Autonomous taking off, positioning and landing of unmanned aerial vehicles (uav) on a mobile platform
CN107240063A (en) A kind of autonomous landing method of rotor wing unmanned aerial vehicle towards mobile platform
CN108156819B (en) Method for calculating the distance from an aircraft to a ground target
JP2012071645A (en) Automatic taking-off and landing system
US11490005B2 (en) Overhead line image capturing system and overhead line image capturing method
KR20200064542A (en) Apparatus for measuring ground control point using unmanned aerial vehicle and method thereof
CN108549376A (en) A kind of navigation locating method and system based on beacon
CA3073034A1 (en) Methods and systems for improving the precision of autonomous landings by drone aircraft on landing targets
US8315748B2 (en) Altitude measurement apparatus and method
CN105388908A (en) Machine vision-based unmanned aerial vehicle positioned landing method and system
CN110488848B (en) Unmanned aerial vehicle vision-guided autonomous landing method and system
JP2019178998A (en) Position identification system
CN107438863B (en) Flight positioning method and device
JP2011112556A (en) Search target position locating device, method, and computer program
US20140078146A1 (en) Flight obstacle extraction device, flight obstacle extraction method, and recording medium
CN112119428A (en) Method, device, unmanned aerial vehicle, system and storage medium for acquiring landing position
CN107576329B (en) Fixed wing unmanned aerial vehicle landing guiding cooperative beacon design method based on machine vision
CN111796605A (en) Unmanned aerial vehicle landing control method, controller and unmanned aerial vehicle

Legal Events

Date Code Title Description
AMND Amendment
E601 Decision to refuse application
AMND Amendment
X701 Decision to grant (after re-examination)