KR101617411B1 - Method and system for controlling a drone - Google Patents

Method and system for controlling a drone

Info

Publication number
KR101617411B1
KR101617411B1
Authority
KR
South Korea
Prior art keywords
user terminal
image
drones
screen
camera
Prior art date
Application number
KR1020150115643A
Other languages
Korean (ko)
Inventor
박시몽
Original Assignee
박시몽
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 박시몽 filed Critical 박시몽
Priority to KR1020150115643A priority Critical patent/KR101617411B1/en
Application granted granted Critical
Publication of KR101617411B1 publication Critical patent/KR101617411B1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • B64C39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C27/00 Rotorcraft; Rotors peculiar thereto
    • B64C27/04 Helicopters
    • B64C27/08 Helicopters with two or more rotors
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • H04M1/72533
    • B64C2201/146

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a method and a system for controlling a drone. The system comprises a user terminal and a drone. The user terminal displays, on its screen, a drone image photographed by a first camera together with a predetermined pattern; the drone photographs the screen of the user terminal with a second camera and calculates a moving direction and a moving distance from the size of the pattern and the position of the drone image in the photographed image. According to the present invention, a user can easily move the drone to a desired position with a simple operation.

Description

TECHNICAL FIELD: The present invention relates to a drone control method and system.

More particularly, the present invention relates to a drone control method and system that enable a user to steer a drone to a desired position with a simple operation.

A drone is a radio-controlled flight device, also called a multi-copter because it flies with multiple propellers. Since drones were first developed for military use in the early 20th century, powerful nations such as the United States have developed them competitively and have used them in actual combat. Recently, as drones have been commercialized, they are used in various fields such as camera shooting.

A drone is a small unmanned aerial vehicle that typically receives the operator's control signals by radio and is flown manually. However, it is quite difficult for a beginner to steer a drone to a specific location. As small drones become more widespread, various inconveniences and accidents caused by inexperienced users become more likely.

Patent Registration No. 10-1527210

SUMMARY OF THE INVENTION: Accordingly, it is an object of the present invention to provide a drone control method and system that enable a user to easily maneuver a drone to a desired position with a simple operation.

According to an aspect of the present invention, there is provided a drone control system including a user terminal that includes a first camera and displays, on a screen, a drone image photographed by the first camera together with a predetermined pattern, and a drone that includes a second camera for photographing the screen of the user terminal and a controller for calculating a moving direction and a moving distance according to the size of the pattern and the position of the drone image displayed in a first image captured by the second camera.

The pattern may have a form from which the width or height of the screen of the user terminal can be calculated.

The second camera may capture second and third images by photographing the user terminal from first and second distances, respectively, and the controller may calculate the distance to the user terminal using the sizes of the pattern represented in the second and third images.

The control unit may calculate the horizontal rotation angle θx of the drone as

(equation image: Figure 112015079547080-pat00001)

wherein Wx is the length from the centerline of the first image to the edge of the pattern appearing in the first image, a is the length from the centerline of the first image to the drone appearing in the first image, and A is 1/2 of the angle of view of the first camera.

According to another embodiment of the present invention, a drone controlled by a user terminal that displays, on a screen, a drone image captured by a first camera together with a predetermined pattern includes a second camera for photographing the screen of the user terminal, and a controller for calculating a moving direction and a moving distance according to the size of the pattern and the position of the drone image in a first image captured by the second camera.

A user terminal for controlling a drone according to another embodiment of the present invention includes a first camera for photographing the drone and a processing unit for displaying the photographed drone image together with a predetermined pattern on a screen, wherein the drone includes a second camera for photographing the user terminal and a controller for calculating a moving direction and a moving distance according to the size of the pattern and the position of the drone image displayed in a first image captured by the second camera.

According to another embodiment of the present invention, there is provided a method of controlling a drone by a user terminal, the method comprising: displaying, on a screen, an image of the drone photographed by the user terminal together with a predetermined pattern; photographing the screen of the user terminal by the drone to generate a first image; and calculating a moving direction and a moving distance according to the size of the pattern and the position of the drone image displayed in the first image.

The method may further comprise photographing the user terminal with the drone at first and second distances to obtain second and third images, respectively, and calculating the distance to the user terminal using the sizes of the pattern represented in the second and third images.

In the moving direction calculating step, the horizontal rotation angle θx of the drone may be calculated as

(equation image: Figure 112015079547080-pat00002)

wherein Wx is the length from the centerline of the first image to the edge of the pattern appearing in the first image, a is the length from the centerline of the first image to the drone appearing in the first image, and A is 1/2 of the angle of view of the first camera.

According to another embodiment of the present invention, a method of controlling a drone by a user terminal that displays, on a screen, a drone image photographed by a camera together with a predetermined pattern includes photographing the screen of the user terminal by the drone to generate a first image, and calculating a moving direction and a moving distance according to the size of the pattern and the position of the drone image appearing in the first image.

According to another embodiment of the present invention, there is provided a method of controlling a drone by a user terminal, comprising photographing the drone and displaying the photographed drone image together with a predetermined pattern on a screen, wherein the drone photographs the screen of the user terminal to generate a first image and calculates a moving direction and a moving distance according to the size of the pattern and the position of the drone image in the first image.

According to the drone control method and system of the present invention, the user can easily move the drone to a desired position with a simple operation. In addition, since an ordinary smartphone can be used as the user terminal, no separate equipment is required, which is advantageous in terms of cost.

FIG. 1 is a schematic diagram of a drone system in accordance with an embodiment of the present invention.
FIG. 2 is a block diagram of the drone shown in FIG. 1.
FIG. 3 is a block diagram of the user terminal shown in FIG. 1.
FIG. 4 is a schematic diagram illustrating a method for controlling a drone in accordance with an embodiment of the present invention.
FIG. 5 is a flowchart illustrating a drone control method according to an embodiment of the present invention.
FIG. 6 is an exemplary diagram illustrating various patterns displayed on the user terminal shown in FIG. 1.
FIG. 7 is a schematic diagram showing an initial setting method of a drone system according to an embodiment of the present invention.
FIG. 8 is a schematic diagram showing the results according to the method shown in FIG. 7.
FIG. 9 is a schematic diagram illustrating computing the distance between a drone and a user terminal in accordance with an embodiment of the present invention.
FIG. 10 is a schematic view illustrating a change of the screen according to the position of the drone according to an embodiment of the present invention.
FIG. 11 is a schematic diagram illustrating calculation of the rotation angle between a drone and a user terminal in accordance with an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention.

FIG. 1 is a schematic diagram of a drone system according to an embodiment of the present invention, FIG. 2 is a block diagram of the drone shown in FIG. 1, and FIG. 3 is a block diagram of the user terminal shown in FIG. 1.

Referring to FIGS. 1 and 2, a drone system according to an embodiment of the present invention includes a drone 100 and a user terminal 200. The drone 100 includes a control unit 110, a communication unit 120, a sensor unit 130, a driving unit 140, and a camera module 150.

The control unit 110 calculates a target position and direction of the drone 100 according to a control signal from the user terminal 200 or according to a predetermined operation, and generates a corresponding control command for the driving unit 140.

The communication unit 120 receives, from the user terminal 200, a control signal with which the user controls the drone 100 and transmits it to the control unit 110, and transmits the image captured by the camera module 150 to the user terminal 200. The communication method between the communication unit 120 and the user terminal 200 may be wireless communication such as Wi-Fi, Bluetooth, or WiBro, but is not limited thereto.

The sensor unit 130 includes an acceleration sensor or a gyro sensor, measures the acceleration and rotation angle of the drone 100, and transmits the measurements to the control unit 110.

The driving unit 140 includes a plurality of motors and propellers. The driving unit 140 receives control commands from the control unit 110 and drives the motors to rotate the propellers so that the drone 100 can fly.

The camera module 150 includes a camera for photographing ground or aerial objects and has a structure capable of adjusting the pan, tilt, or zoom of the camera. The camera module 150 may receive signals related to camera shooting and to pan, tilt, or zoom adjustment from the user terminal 200, capture images, and adjust the pan, tilt, or zoom accordingly. Alternatively, shooting and pan, tilt, or zoom may be controlled through the control unit 110; the control unit 110 can control the position and orientation of the drone 100 together with the pan, tilt, or zoom of the camera, so that the user can conveniently operate the drone 100 and easily obtain the desired images.

The camera module 150 also captures the pattern image displayed on the screen of the user terminal 200 together with the drone image shown there, and the control unit 110 analyzes the captured pattern image to generate movement control commands for the drone 100.

Referring to FIG. 3, the user terminal 200 includes a processing unit 210, a communication unit 220, an input unit 230, a display unit 240, and a camera 250. The user terminal 200 may be a smartphone, a dedicated drone pilot terminal, or both.

The processing unit 210 executes instructions and generates or uses data. For example, the processing unit 210 may process input and output data between the components of the user terminal 200, interpret input information such as touch information received from the input unit 230, and display the image photographed by the camera 250 on the display unit 240.

The communication unit 220 communicates with the drone 100, receives image data captured by the drone 100, and transmits control signals of the user terminal 200 to the drone 100.

The input unit 230 transmits data to the user terminal 200 according to the user's input. The input unit 230 may be used to input characters, to instruct the start of an application provided by the user terminal 200, or to make selections related to a graphical user interface (GUI) on the display unit 240. The input unit 230 may include a touch pad or a touch screen that generates touch information according to the user's touch and transmits the information to the processing unit 210, and may be coupled to or integrated with the display unit 240. The input unit 230 may also include a switch or a button, for example for adjusting the volume, in addition to the touch pad or touch screen.

The display unit 240 outputs various kinds of information to the screen and may include, for example, a liquid crystal display or organic light-emitting diodes. The display unit 240 may display a graphical user interface (GUI), which provides an interface that allows the user to easily use applications running on the user terminal 200.

The display unit 240 may display a predefined pattern image to activate the movement control operation of the drone 100, and may also display the image of the drone 100 taken by the camera 250 so that movement control commands for the drone 100 can be generated.

The camera 250 is arranged to shoot toward the front of the user terminal 200 and can photograph the drone 100 so that an image including the drone 100 is displayed on the display unit 240.

A method of controlling the drone will now be described with reference to FIG. 4, which is a schematic diagram illustrating a method for controlling a drone in accordance with an embodiment of the present invention.

Referring to FIG. 4, the user directs the screen of the user terminal 200 toward the camera module 150 of the drone 100 so that the screen faces the drone 100. The control unit 110 of the drone 100 acquires an image of the specific pattern appearing on the screen of the user terminal 200 through the camera module 150 and generates commands for controlling the movement of the drone 100.

Meanwhile, the camera 250 of the user terminal 200 photographs the drone 100, and the display unit 240 displays the specific pattern together with the photographed image of the drone 100.

The user can control the drone 100 by moving the user terminal 200 so that it faces the position to which the drone 100 should move. That is, if the user orients the user terminal 200 so that the line connecting the desired position of the drone 100 and the center of the screen is perpendicular to the screen of the user terminal 200, the drone analyzes its own image displayed on the screen of the user terminal 200 and moves to the desired position.

In this way, the user can move the drone 100 to a desired position with only the simple operation of moving the user terminal 200, which makes the drone 100 convenient for taking self-portraits or videos.
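By way of illustration only, a minimal Python sketch of this control loop is given below; the objects and helper names (drone_camera, flight_controller, capture_frame, measure_pattern, estimate_distance_from, angles_from, move_on_sphere) are hypothetical placeholders for the steps detailed with FIGS. 5 to 11, not APIs defined in the patent.

    def control_loop(drone_camera, flight_controller):
        # Repeat while airborne: photograph the terminal screen, measure the
        # displayed pattern and drone image, then move so that the drone ends up
        # on the line perpendicular to the centre of the screen.
        while flight_controller.is_flying():
            frame = drone_camera.capture_frame()
            pattern = measure_pattern(frame)        # size of the on-screen pattern
            if pattern is None:
                continue                            # terminal screen not visible yet
            x = estimate_distance_from(pattern)     # distance to the user terminal
            theta_x, theta_y = angles_from(frame)   # offsets of the displayed drone image
            flight_controller.move_on_sphere(x, theta_x, theta_y)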

The drone control method will now be described in more detail with reference to FIGS. 5 to 11.

FIG. 5 is a flowchart illustrating a drone control method according to an embodiment of the present invention, FIG. 6 is an exemplary view of various patterns displayed on the user terminal shown in FIG. 1, and FIG. 7 is a schematic view showing an initial setting method of the drone system according to an embodiment of the present invention. FIG. 8 is a schematic diagram showing the results of the method shown in FIG. 7, and FIG. 9 is a schematic diagram illustrating calculation of the distance between the drone and the user terminal according to an embodiment of the present invention. FIG. 10 is a schematic view illustrating a change of the screen according to the position of the drone according to an embodiment of the present invention, and FIG. 11 is a schematic diagram illustrating calculation of the rotation angle between the drone and the user terminal according to an embodiment of the present invention.

First, the pattern image displayed on the user terminal 200 for drone control will be described. The pattern image is predefined so that it can be recognized by the control unit 110 of the drone 100, and may be any shape from which the width and height of the screen of the user terminal 200 can be calculated. As shown in FIG. 6, for example, (a) a rectangular band surrounding the whole screen, (b) rectangles located at the four corners of the screen, or (c) "L"-shaped marks, or the like, may be used, but the present invention is not limited thereto.
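As a purely illustrative aid (not part of the patent), the sketch below measures the pattern's pixel width and height in a frame from the drone's camera, assuming pattern (a) of FIG. 6, a bright rectangular band around the screen, and plain OpenCV thresholding; the function name and threshold value are assumptions.

    import cv2

    def measure_pattern(frame_bgr, min_area=1000):
        """Return (width_px, height_px) of the largest bright region, or None."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        best, best_area = None, min_area
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            if w * h > best_area:
                best, best_area = (w, h), w * h
        return best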

In the drone control method according to an embodiment of the present invention, an initial setting for drone control is performed first (S100). The initial setting finds the constant values used in calculating the travel distance of the drone 100.

A method of performing the initial setting for drone control will be described in detail with reference to FIG. 7.

The user terminal 200 is fixed at a predetermined position P0 on a table or the floor 10, and a predetermined pattern image is displayed on its screen. With the camera of the drone 100 facing the screen of the user terminal 200, the position of the drone 100 is manually adjusted by moving it along the direction perpendicular to the screen. The drone 100 is placed at the position P1 where the entire pattern displayed on the screen of the user terminal 200 appears, at its largest, in the image photographed by the camera module 150 of the drone 100. If the drone is closer than P1, the entire pattern does not fit in the image photographed by the camera module 150; if it is farther than P1, the entire pattern appears but is smaller. Therefore, P1 is the position at which the entire pattern appears and the user terminal 200 and the drone 100 are at the closest distance (nearest distance, dm).

The control unit 110 of the drone 100 calculates the width W and the height H of the pattern in the image photographed at the nearest distance and stores these values. The width W and the height H may be absolute values, but they may also be relative values, such as numbers of pixels in the photographed image; that is, these values are merely numbers for calculation and need not be actual lengths. The height H and the width W are used as constants to calculate the distance between the user terminal 200 and the drone 100. FIG. 8(a) shows the height H and the width W of a pattern having rectangles at the four corners of the screen of the user terminal 200.

When the calculation of the width W and the height H is completed, the drone 100 is moved to a position P2 separated from the position P1 by a predetermined distance d. The control unit 110 of the drone 100 calculates the height Hr and the width Wr of the pattern on the screen of the user terminal 200 in the image captured by the camera of the drone 100, as shown in FIG. 8(b), and then calculates and stores the virtual maximum distance D at which the camera of the drone 100 could still photograph and recognize the pattern, as shown in the following Equation (1). The virtual maximum distance D is a constant that is used later in the distance calculation.

[Equation 1]

(equation image: Figure 112015079547080-pat00003)

In this way, the control unit 110 of the drone 100 calculates and stores the height H, the width W, and the virtual maximum distance D, completing the initial setting. The initial setting needs to be performed only once per model or specification of the user terminal 200; if the model or specification of the user terminal 200 changes, the initial setting is performed again. It is also possible to calculate the height H and the width W of the pattern by photographing at a position farther than the nearest distance dm. In addition, although the maximum distance D is calculated using the heights (H, Hr) in Equation (1), it may instead be calculated using the widths (W, Wr).
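Equation (1) is reproduced only as an image, so the sketch below uses an assumed linear model in which the pattern's apparent height shrinks to zero at the virtual maximum distance D, giving D = H·d/(H − Hr); this is an illustrative stand-in, not the verified formula from the patent.

    def virtual_maximum_distance(h_nearest, h_farther, d):
        """
        Initial setting (S100), under an assumed linear shrink model.
        h_nearest: pattern height H (pixels) measured at the nearest position P1
        h_farther: pattern height Hr (pixels) measured at P2, a distance d beyond P1
        d:         physical distance between P1 and P2
        Returns the virtual maximum distance D, stored as a constant.
        """
        if h_farther >= h_nearest:
            raise ValueError("the pattern must appear smaller at the farther position")
        return h_nearest * d / (h_nearest - h_farther)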

When the initial setting is completed, movement of the drone 100 can be controlled with the user terminal 200 while the drone is flying. The camera of the drone 100 starts shooting toward the screen of the user terminal 200, and when the control unit 110 analyzes the photographed image and detects the predetermined pattern image on the screen of the user terminal 200, the movement control of the drone 100 begins.

The control unit 110 of the drone 100 then calculates the distance between the user terminal 200 and the drone 100 (S110). Referring to FIG. 9, the drone 100 hovering at a position Px calculates the height Hx of the pattern displayed on the screen of the user terminal 200 from the image of the screen. The distance x can then be calculated from the height H obtained in the initial setting and the virtual maximum distance D, as shown in the following Equation (2). This distance x corresponds to the distance between the user terminal 200 and the drone 100.

[Equation 2]

(equation image: Figure 112015079547080-pat00004)
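Under the same assumed linear model (Equation (2) is likewise published only as an image), a measured pattern height Hx would map to a distance x = D·(H − Hx)/H, as sketched below for illustration.

    def estimate_distance(h_measured, h_nearest, d_max):
        """
        Distance step (S110), under the same assumed linear shrink model.
        h_measured: pattern height Hx (pixels) seen from the current position Px
        h_nearest:  constant H from the initial setting
        d_max:      virtual maximum distance D from the initial setting
        """
        return d_max * (h_nearest - h_measured) / h_nearest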

When the distance x between the drone 100 and the user terminal 200 has been calculated, the moving direction of the drone 100 is calculated (S120).

While the camera of the drone 100 photographs the screen of the user terminal 200, the camera 250 of the user terminal 200 photographs the drone 100 and displays it on the screen of the user terminal 200. Therefore, the image taken by the camera of the drone 100 includes the drone 100 as displayed on the screen of the user terminal 200.

In the upper drawing of FIG. 10, the user terminal 200 is held vertically with respect to the ground with its screen and camera 250 facing to the right, and the drone 100 is located at each of the positions Pa, Pb, and Pc; the lower drawing shows how the drone 100 is displayed on the screen of the user terminal 200 for each of the positions Pa, Pb, and Pc. In FIGS. 10 and 11, the drone 100 displayed on the screen of the user terminal 200 is drawn as a circular point for convenience of illustrating its position.

Referring to FIG. 10, when the user terminal 200 is fixed, the position of the drone 100 displayed on its screen changes according to the actual position of the drone 100. When the drone 100 is positioned at the center in front of the user terminal 200, it is displayed at the center of the screen of the user terminal 200; when the drone 100 moves to the front left or front right of the user terminal 200, it is displayed off the center of the screen toward the corresponding side.

Also, as the drone 100 moves farther from the centerline in front of the screen of the user terminal 200, the drone 100 displayed on the screen moves farther from the center of the screen of the user terminal 200.

The control unit 110 of the drone 100 can therefore determine in which direction the drone 100 should move according to the position of the drone 100 displayed on the screen of the user terminal 200. To locate this position, the control unit 110 may recognize the body of the drone 100 displayed on the screen of the user terminal 200, or may use LEDs mounted on the drone 100 or specific markers attached to its body.
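As an illustration of the marker-based approach mentioned above, the sketch below locates a red LED or marker in the screen region of the captured frame using an HSV colour threshold and returns its pixel offsets (a, b) from the centre; the colour bounds and function name are assumptions, not details given in the patent.

    import cv2

    def drone_offset(screen_roi_bgr):
        """Return (a, b), the pixel offset of the drone marker from the screen-image centre."""
        hsv = cv2.cvtColor(screen_roi_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))   # assumed red LED/marker
        m = cv2.moments(mask)
        if m["m00"] == 0:
            return None                                          # marker not found
        cx = m["m10"] / m["m00"]
        cy = m["m01"] / m["m00"]
        h, w = mask.shape
        return cx - w / 2.0, cy - h / 2.0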

The angle of view of the camera 250 of the user terminal 200 differs for each camera model, and the position of the drone 100 recognized according to the angle of view is calculated as a ratio with respect to the height and width of the screen.

For example, referring to FIG. 11, for a camera having a horizontal angle of view of 2A, the horizontal rotation angle θx is calculated from the distance a from the center as shown in the following Equation (3), and the vertical rotation angle θy is calculated from the distance b from the center as shown in the following Equation (4).

[Equation 3]

(equation image: Figure 112015079547080-pat00005)

[Equation 4]

(equation image: Figure 112015079547080-pat00006)
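Because Equations (3) and (4) are reproduced only as images, the sketch below assumes the simplest mapping consistent with the definitions above: the rotation angle scales linearly with the offset from the image centre, as a fraction of the centre-to-edge extent, times the half angle of view. The patent may instead use a trigonometric form, so treat this as illustrative.

    def rotation_angles(a, b, half_width_px, half_height_px, half_fov_x_deg, half_fov_y_deg):
        """
        Angle step (S120): a, b are the pixel offsets of the drone image from the
        centre; half_width_px / half_height_px are the centre-to-edge extents of
        the image; half_fov_x_deg is A (half the horizontal angle of view 2A) and
        half_fov_y_deg is its vertical counterpart.
        """
        theta_x = half_fov_x_deg * a / half_width_px
        theta_y = half_fov_y_deg * b / half_height_px
        return theta_x, theta_y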

The moving direction and rotation angles of the drone 100 are determined using the horizontal rotation angle θx and the vertical rotation angle θy, and the total movement distance is then calculated on the spherical surface having the radius x obtained in Equation (2) (S130).
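As a rough illustration of step S130, arc lengths on the sphere of radius x can approximate the travel; how the patent combines the horizontal and vertical components is not reproduced here, so the Euclidean combination below is an assumption.

    import math

    def movement_on_sphere(x, theta_x_deg, theta_y_deg):
        """Approximate horizontal, vertical, and total travel along the sphere of radius x."""
        dx = x * math.radians(theta_x_deg)   # horizontal arc length
        dy = x * math.radians(theta_y_deg)   # vertical arc length
        return dx, dy, math.hypot(dx, dy)    # combined move distance (assumed)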

Meanwhile, the distance between the drone 100 and the user terminal 200 can be adjusted using the input unit 230 of the user terminal 200; for example, the distance from the user terminal 200 to the drone 100 may be changed using a volume control button of the user terminal 200 or a GUI displayed on the touch screen.

As described above, according to the drone control system and method of the embodiments of the present invention, the user can easily maneuver the drone 100 to a desired position with a simple operation. In addition, various scenes can be photographed more conveniently when taking self-portraits or videos with the drone 100.

While the present invention has been particularly shown and described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments and covers all changes and modifications within the scope of the invention.

100: drones, 110: control unit,
120, 220: communication unit, 130: sensor unit,
140: driving unit, 150: camera module,
200: user terminal, 210: processor,
230: input unit, 240: display unit,
250: camera

Claims (12)

A system for controlling a drone, comprising:
a user terminal comprising a first camera and displaying, on a screen, a predetermined pattern and a drone image photographed by the first camera; and
a drone comprising a second camera and a controller, wherein the controller calculates the size of the pattern displayed in a first image of the screen of the user terminal captured by the second camera, and calculates a moving direction and a moving distance according to the position of the drone image displayed on the screen of the user terminal, which changes as the user moves the user terminal.
The system of claim 1,
wherein the pattern has a form from which the width or height of the screen of the user terminal can be calculated.
The system of claim 1,
wherein the second camera captures second and third images by photographing the user terminal at first and second distances, respectively, and
the controller calculates the distance to the user terminal using the sizes of the pattern represented in the second and third images.
The system of claim 1,
wherein the controller calculates the horizontal rotation angle θx of the drone as
(equation image: Figure 112015079547080-pat00007)
wherein Wx is the length from the centerline of the first image to the edge of the pattern appearing in the first image, a is the length from the centerline of the first image to the drone appearing in the first image, and A is 1/2 of the angle of view of the first camera.
A drone operated by a user terminal that includes a first camera and displays, on a screen, a drone image photographed by the first camera together with a predetermined pattern, the drone comprising:
a second camera, and
a control unit that calculates the size of the pattern displayed in a first image of the screen of the user terminal captured by the second camera, and calculates a moving direction and a moving distance according to the position of the drone image displayed on the screen of the user terminal, which changes as the user moves the user terminal.
A user terminal for controlling a drone, comprising:
a first camera for photographing the drone, and
a processing unit for displaying the photographed drone image and a predetermined pattern on a screen,
wherein the drone calculates the size of the pattern displayed in a first image captured by a second camera of the drone, and calculates a moving direction and a moving distance according to the position of the drone image displayed on the screen of the user terminal, which changes as the user moves the user terminal.
A method for controlling a drone by a user terminal, the method comprising:
displaying, on a screen, an image of the drone photographed by the user terminal together with a predetermined pattern;
photographing the screen of the user terminal by the drone to generate a first image; and
calculating the size of the pattern represented in the first image, and calculating a moving direction and a moving distance according to the position of the drone image displayed on the screen of the user terminal, which changes as the user moves the user terminal.
8. The method of claim 7,
wherein the pattern has a form from which the width or height of the screen of the user terminal can be calculated.
The method of claim 7, further comprising:
photographing the user terminal with the drone at first and second distances to obtain second and third images, respectively; and
calculating the distance to the user terminal using the sizes of the pattern represented in the second and third images.
The method of claim 7,
wherein the moving direction calculating step calculates the horizontal rotation angle θx of the drone as
(equation image: Figure 112015079547080-pat00008)
wherein Wx is the length from the centerline of the first image to the edge of the pattern appearing in the first image, a is the length from the centerline of the first image to the drone appearing in the first image, and A is 1/2 of the angle of view of the first camera.
A method of controlling a drone by a user terminal that displays, on a screen, a drone image photographed by a camera together with a predetermined pattern, the method comprising:
photographing the screen of the user terminal by the drone to generate a first image; and
calculating the size of the pattern represented in the first image, and calculating a moving direction and a moving distance according to the position of the drone image displayed on the screen of the user terminal, which changes as the user moves the user terminal.
A method for controlling a drone by a user terminal, the method comprising:
photographing the drone, and
displaying, on a screen, a predetermined pattern and the photographed drone image,
wherein the drone photographs the screen of the user terminal to generate a first image, calculates the size of the pattern displayed in the first image, and calculates a moving direction and a moving distance according to the position of the drone image displayed on the screen of the user terminal, which changes as the user moves the user terminal.
KR1020150115643A 2015-08-17 2015-08-17 Method and system for controlling a drone KR101617411B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150115643A KR101617411B1 (en) 2015-08-17 2015-08-17 Method and system for controlling a drone

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150115643A KR101617411B1 (en) 2015-08-17 2015-08-17 Method and system for controlling a drone

Publications (1)

Publication Number Publication Date
KR101617411B1 true KR101617411B1 (en) 2016-05-18

Family

ID=56113521

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150115643A KR101617411B1 (en) 2015-08-17 2015-08-17 Method and system for controlling a drone

Country Status (1)

Country Link
KR (1) KR101617411B1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101502275B1 (en) * 2014-04-11 2015-03-13 중앙대학교 산학협력단 Automatically Driven Control Apparatus for non people helicopters and Control Method the same

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
항공우주산업기술동향 (Aerospace Industry Technology Trends), Vol. 7, No. 2, pp. 115-120*

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018016793A1 (en) * 2016-02-26 2018-01-25 (주)스마트모션 Drone control device using composite sensor and method therefor
KR20180060403A (en) * 2016-11-29 2018-06-07 문창근 Control apparatus for drone based on image
KR20180065796A (en) 2016-12-08 2018-06-18 동국대학교 산학협력단 Method for controlling drone of using speech recognition, apparatus and system for executing the same
KR101897597B1 (en) 2017-06-19 2018-09-13 주식회사 스카이텍 Remote controller connection system for drone control
KR101896239B1 (en) * 2017-09-20 2018-09-07 (주)텔트론 System for controlling drone using motion capture
KR20200077910A (en) 2018-12-21 2020-07-01 주식회사 스카이텍 A drone system for line installation by using remote controll right transfer
KR20200132038A (en) 2019-05-15 2020-11-25 주식회사 스카이텍 Dron system for power line inspection that can intercept electromagnetic wave
KR102122752B1 (en) * 2019-10-23 2020-06-15 주식회사 네스앤텍 Unmanned vehicle control method for improving dropping accuracy
KR102250490B1 (en) 2020-08-24 2021-05-13 에스큐엔지니어링(주) System for the measurement of strength and physical parameters of concrete structures using impulse forces applied by hitting instrument mounted on drones
KR20220063651A (en) 2020-11-10 2022-05-17 금오공과대학교 산학협력단 Image-based active marker module and system for estimating 6D pose of robot, and marker recognition method using the same
US11883761B2 (en) 2020-11-12 2024-01-30 Universal City Studios Llc System and method for interactive drone experience
KR20230016730A (en) 2021-07-26 2023-02-03 주식회사 제이슨랩 An automatic landing system to guide the drone to land precisely at the landing site

Similar Documents

Publication Publication Date Title
KR101617411B1 (en) Method and system for controlling a drone
US11797009B2 (en) Unmanned aerial image capture platform
US20210276731A1 (en) Methods and systems for movement control of flying devices
US9886033B2 (en) System for piloting a drone in immersion
JP4012749B2 (en) Remote control system
JP2019507924A (en) System and method for adjusting UAV trajectory
KR100931029B1 (en) Unmanned aerial vehicle control method for aerial photography
CN107065894B (en) Unmanned aerial vehicle, flying height control device, method, and computer-readable recording medium
US10580216B2 (en) System and method of simulating first-person control of remote-controlled vehicles
CN109997091B (en) Method for managing 3D flight path and related system
US20200097026A1 (en) Method, device, and system for adjusting attitude of a device and computer-readable storage medium
KR20170090888A (en) Apparatus for unmanned aerial vehicle controlling using head mounted display
WO2020233682A1 (en) Autonomous circling photographing method and apparatus and unmanned aerial vehicle
KR101617383B1 (en) Method for controlling take-off of a drone and drone employing the same
KR101682797B1 (en) Apparatus for tangible control of unmanned aerial vehicle and Method for control thereof
KR20180025416A (en) Drone flying control system and method using motion recognition and virtual reality
KR101600699B1 (en) Flight recording system and operating method thereof
WO2020209167A1 (en) Information processing device, information processing method, and program
KR101615739B1 (en) Drone and method for measuring distance from user terminal to drone
WO2023097918A1 (en) Method for monitoring unmanned aerial vehicle, and terminal and readable storage medium
WO2022056683A1 (en) Field of view determination method, field of view determination device, field of view determination system, and medium
WO2022188151A1 (en) Image photographing method, control apparatus, movable platform, and computer storage medium
WO2018045654A1 (en) Method and system for displaying state of mobile device and control device
CN114641744A (en) Control method, apparatus, system, and computer-readable storage medium
KR20200005504A (en) Dron flight practicing system

Legal Events

Date Code Title Description
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190422

Year of fee payment: 4