KR101758093B1 - Apparatus and method for controlling unmanned aerial vehicle


Info

Publication number
KR101758093B1
Authority
KR
South Korea
Prior art keywords
uav
unmanned airplane
user
position information
detected
Prior art date
Application number
KR1020150132831A
Other languages
Korean (ko)
Other versions
KR20170034503A (en)
Inventor
이상준
신상희
권휘중
신소라
이소연
Original Assignee
숭실대학교산학협력단 (Soongsil University Industry-Academic Cooperation Foundation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 숭실대학교산학협력단
Priority to KR1020150132831A
Publication of KR20170034503A
Application granted
Publication of KR101758093B1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C - AEROPLANES; HELICOPTERS
    • B64C39/00 - Aircraft not otherwise provided for
    • B64C39/02 - Aircraft not otherwise provided for characterised by special use
    • B64C39/024 - Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • H04M1/72533
    • B64C2201/127
    • B64C2201/141
    • B64C2201/145
    • B64C2201/146

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An unmanned aerial vehicle control system and method are disclosed.
The unmanned aerial vehicle control system includes an unmanned aerial vehicle and a mobile terminal. When the GPS position information of the unmanned aerial vehicle can be acquired, the unmanned aerial vehicle flies according to a flight control signal that the mobile terminal generates based on that GPS position information; when the GPS position information cannot be acquired, the unmanned aerial vehicle controls its own flight by analyzing the image captured by its camera.

Description

APPARATUS AND METHOD FOR CONTROLLING UNMANNED AERIAL VEHICLE

The present invention relates to an unmanned aerial vehicle control system and method, and more particularly, to an unmanned aerial vehicle control system and method capable of automatically tracking a user and avoiding an obstacle on a flight path.

In recent years, high-performance flying drones have attracted attention, and their applications are expanding to aerial photography, broadcast image production, structure inspection, logistics delivery, surveillance, surveying, pest and disaster control, and military use.

However, a conventional unmanned aerial vehicle requires a separate pilot to make it track the subject to be photographed or to steer it around obstacles in its flight path.

In addition, a conventional unmanned aerial vehicle is flown using its GPS position information, so flight control becomes difficult in areas where GPS reception sensitivity is poor, such as indoors, where the position of the vehicle cannot be detected accurately.

Accordingly, there is a need for an unmanned aerial vehicle control system that can track the subject to be photographed without a separate pilot, automatically identify and avoid obstacles in the flight path, and control flight even in regions where GPS reception sensitivity is low.

One aspect of the present invention provides an unmanned aerial vehicle control system and method that control the flight of the unmanned aerial vehicle using its position information when that position information can be detected, and that control the flight by analyzing an image captured by the unmanned aerial vehicle when it cannot.

According to one aspect of the present invention, an unmanned aerial vehicle control system includes an unmanned aerial vehicle that sets its operation mode to either a GPS-based tracking mode or a captured-image-based tracking mode depending on whether GPS position information is detected. In the GPS-based tracking mode, the unmanned aerial vehicle flies according to a flight control signal based on the GPS position information received from a mobile terminal; in the captured-image-based tracking mode, it controls its own flight by analyzing the image captured by its camera. The mobile terminal receives the GPS position information of the unmanned aerial vehicle, generates a flight control signal based on that information, and transmits it to the unmanned aerial vehicle.

When its operation mode is set to the captured-image-based tracking mode, the unmanned aerial vehicle may detect a user in the captured image, determine whether the user is within a predetermined proper existence range within the image, and control its flight accordingly.

Controlling the flight according to whether the user is located within the predetermined proper existence range may include, when the user is not within that range, detecting the direction in which the user is located and rotating the unmanned aerial vehicle toward that direction.

Detecting the direction in which the user is located may include detecting the user's position in the captured image and detecting the direction of that position relative to the center point of the predetermined proper existence range.

After rotating toward the direction in which the user is located, the unmanned aerial vehicle may analyze the captured image to check whether the user has moved out of the preset proper existence range, detect the direction in which the user departed from that range, and rotate according to the detected direction of departure.

When its operation mode is set to the captured-image-based tracking mode, the unmanned aerial vehicle may detect a user in the captured image and determine whether the user appears larger or smaller than a preset size in the image; if the user appears larger than the preset size, the unmanned aerial vehicle may stop or move backward, and if the user appears smaller than the preset size, it may move forward.

The mobile terminal may receive the GPS position information of the unmanned aerial vehicle from the unmanned aerial vehicle, detect its own GPS position information, calculate the distance and azimuth angle between the unmanned aerial vehicle and the mobile terminal using the two sets of GPS position information, and generate the GPS-based flight control signal using the calculated distance and azimuth angle.

Generating the GPS-based flight control signal may include checking whether the distance between the unmanned aerial vehicle and the mobile terminal is equal to or greater than a preset distance, and generating a flight control signal containing a forward signal when it is.

Generating the GPS-based flight control signal may also include receiving the current azimuth information of the unmanned aerial vehicle from the unmanned aerial vehicle, calculating the difference between the azimuth angle between the unmanned aerial vehicle and the mobile terminal and the current azimuth of the unmanned aerial vehicle, and generating a flight control signal containing a rotation control signal that rotates the unmanned aerial vehicle according to the calculated difference.

The unmanned aerial vehicle may check, through a pre-installed obstacle detection sensor, whether an obstacle is present in front of it; when an obstacle is present, it may generate an obstacle map of its surroundings, detect in the map an avoidance path through which it can pass, and fly according to that avoidance path.

After flying according to the avoidance path, the unmanned aerial vehicle may check whether an obstacle is still detected in front of it and, if no obstacle is detected, switch its operation mode from the obstacle avoidance mode back to the tracking mode.

Switching the operation mode from the obstacle avoidance mode to the tracking mode may include checking whether GPS position information of the unmanned aerial vehicle is detected; controlling the unmanned aerial vehicle according to the GPS-based tracking mode when it is detected; when it is not detected, analyzing a surrounding image captured at the current position to check whether the user is detected and, if so, controlling the unmanned aerial vehicle according to the captured-image-based tracking mode; and, if the user is not detected, moving the unmanned aerial vehicle within a predetermined radius of its current position until GPS position information is detected and then controlling it according to the GPS-based tracking mode.

According to another aspect of the present invention, a method for controlling an unmanned aerial vehicle includes: checking, in the unmanned aerial vehicle, whether GPS position information is detected; setting the operation mode of the unmanned aerial vehicle to either a GPS-based tracking mode or a captured-image-based tracking mode depending on the result; when the operation mode is set to the GPS-based tracking mode, controlling the flight of the unmanned aerial vehicle according to a flight control signal that the mobile terminal generates from the GPS position information of the unmanned aerial vehicle and the GPS position information of the mobile terminal; and, when the operation mode is set to the captured-image-based tracking mode, controlling the flight of the unmanned aerial vehicle by analyzing the image captured by its camera.

According to an aspect of the present invention, a user can be tracked automatically without a separate pilot; when the position information of the unmanned aerial vehicle cannot be acquired, the user can still be tracked automatically by analyzing the image captured by the unmanned aerial vehicle; and by detecting obstacles around the unmanned aerial vehicle and finding an avoidance path, the unmanned aerial vehicle can photograph and track the user automatically without colliding with obstacles.

FIG. 1 is a diagram illustrating an unmanned aerial vehicle control system according to an embodiment of the present invention.
FIG. 2 is a block diagram of the unmanned aerial vehicle shown in FIG. 1.
FIG. 3 is a detailed block diagram of the unmanned aerial vehicle control unit shown in FIG. 2.
FIG. 4 is a view for explaining a method by which the image analysis unit shown in FIG. 3 analyzes an image to recognize a user.
FIG. 5 is a view for explaining a method by which the flight control unit shown in FIG. 3 controls the unmanned aerial vehicle according to the movement of the user.
FIGS. 6A and 6B are diagrams for explaining a method of controlling the unmanned aerial vehicle according to the distance between the user and the unmanned aerial vehicle.
FIG. 7 is a view for explaining a method of detecting an obstacle existing in front of the unmanned aerial vehicle.
FIG. 8 is a diagram for explaining a method of detecting an avoidance path by the avoidance path detecting unit shown in FIG. 3.
FIG. 9 is a view for explaining a method of controlling the flight of the unmanned aerial vehicle according to the avoidance path detected through the avoidance path detecting unit.
FIG. 10 is a flowchart illustrating an unmanned aerial vehicle control method according to an embodiment of the present invention.
FIG. 11 is a flowchart illustrating a method of controlling the unmanned aerial vehicle when its operation mode is set to the GPS-based tracking mode in FIG. 10.
FIGS. 12A and 12B are flowcharts illustrating a method of controlling the unmanned aerial vehicle when its operation mode is set to the captured-image-based tracking mode in FIG. 10.
FIGS. 13A and 13B are flowcharts illustrating a method of controlling the unmanned aerial vehicle when its operation mode is set to the obstacle avoidance mode and a method of controlling it when its operation mode is switched from the obstacle avoidance mode to the tracking mode.
FIG. 14 is a block diagram of the mobile terminal shown in FIG. 1.
FIG. 15 is a view for explaining the operation of the distance calculating unit shown in FIG. 14.
FIGS. 16A and 16B are flowcharts illustrating a control method of a mobile terminal according to an embodiment of the present invention.

The following detailed description of the invention refers to the accompanying drawings, which illustrate, by way of example, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. It should be understood that the various embodiments of the present invention are different from one another but need not be mutually exclusive. For example, certain features, structures, and characteristics described herein in connection with one embodiment may be implemented in other embodiments without departing from the spirit and scope of the invention. It should also be understood that the position or arrangement of individual components within each disclosed embodiment may be varied without departing from the spirit and scope of the invention. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present invention is limited only by the appended claims, along with the full scope of equivalents to which such claims are entitled. In the drawings, like reference numerals refer to the same or similar functions throughout the several views.

Hereinafter, preferred embodiments of the present invention will be described in more detail with reference to the drawings.

FIG. 1 is a diagram illustrating an unmanned aerial vehicle control system according to an embodiment of the present invention.

The unmanned aerial vehicle control system 1 according to an embodiment of the present invention includes an unmanned aerial vehicle 100 and a mobile terminal 200 that can control the flight of the unmanned aerial vehicle 100.

The flight of the UAV 100 can be controlled by the mobile terminal 200 using GPS position information, or controlled by the UAV 100 itself based on the image captured by its onboard camera.

Specifically, when the UAV 100 can acquire GPS position information, it transmits that information to the mobile terminal 200, receives a flight control signal from the mobile terminal 200, and flies according to the received signal. When the UAV 100 is at a location where GPS reception sensitivity is low, such as indoors, and cannot acquire GPS position information, it detects a user in the image captured by its onboard camera, controls its flight according to the detected movement of the user, and also detects the size of the user in the captured image and controls its flight according to the detected size.

The mobile terminal 200 can receive the position information of the UAV 100 from the UAV 100 and, when it does, detect its own position. Using the position information of the UAV 100 and the position information of the mobile terminal 200, the mobile terminal 200 can calculate the distance and the azimuth angle between the mobile terminal 200 and the UAV 100. The mobile terminal 200 can then generate a flight control signal for the UAV 100 according to the calculated distance and azimuth angle and transmit the generated flight control signal to the UAV 100.

FIG. 2 is a block diagram of the UAV shown in FIG. 1, FIG. 3 is a detailed block diagram of the unmanned aerial vehicle control unit shown in FIG. 2, FIG. 4 is a view for explaining a method by which the image analysis unit shown in FIG. 3 recognizes a user in an image, FIG. 5 is a view for explaining a method by which the flight control unit shown in FIG. 3 controls the UAV according to the movement of the user, FIGS. 6A and 6B are diagrams for explaining a method of controlling the UAV according to the distance between the user and the UAV, FIG. 7 is a view for explaining a method of detecting an obstacle existing in front of the UAV, FIG. 8 is a diagram for explaining a method of detecting an avoidance path by the avoidance path detecting unit shown in FIG. 3, and FIG. 9 is a view for explaining a method of controlling the flight of the UAV according to the detected avoidance path.

Referring to FIG. 2, the UAV 100 according to an embodiment of the present invention includes an unmanned aerial vehicle communication unit 110, a photographing unit 120, a sensor unit 130, an unmanned aerial vehicle control unit 140, and a further unit (150).

The unmanned aerial vehicle communication unit 110 may communicate wirelessly with the mobile terminal 200 to transmit and receive information, and may acquire the GPS position information of the UAV 100 using a GPS module.

At this time, the UAV communication unit 110 can transmit the position information of the UAV 100 to the mobile terminal 200 through wireless communication such as WiFi, and can receive the flight control signal from the mobile terminal 200.

The photographing unit 120 can photograph the surroundings of the UAV 100 using a camera provided on the UAV 100. The photographing unit 120 may transmit the captured image to the control unit 140 and may also transmit the captured image to the mobile terminal 200 through the unmanned aerial vehicle communication unit 110.

The sensor unit 130 detects obstacles in the vicinity of the UAV 100 using a pre-installed ultrasonic sensor or infrared sensor and, when an obstacle is present in front of the UAV 100, detects the distance between the UAV 100 and the obstacle. When the sensor unit 130 detects an obstacle in front of the UAV 100, it may detect the distance between the UAV 100 and the obstacle and transmit an obstacle detection signal to the unmanned aerial vehicle control unit 140.

The sensor unit 130 may detect obstacles not only in front of the UAV 100 but over the full 360° around the UAV 100. For this purpose, a plurality of obstacle detection sensors may be provided. Alternatively, a rotation member capable of rotating an obstacle detection sensor may be further provided on the UAV 100, and the obstacle detection sensor rotates through 360° by means of the rotation member so that obstacles all around the UAV 100 can be detected.
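
As a rough illustrative sketch (not part of the disclosed embodiments), the 360° sweep described above could be expressed as follows in Python; the sensor-reading callback and the step size are assumptions made only for illustration.

```python
def sweep_obstacles(read_distance, step_deg=10):
    """Collect (bearing in degrees, distance in metres) samples over a full
    360-degree rotation of the obstacle detection sensor.

    read_distance(bearing_deg) stands in for whatever driver call returns the
    ultrasonic/infrared range reading at the given bearing (an assumption).
    """
    samples = []
    for bearing in range(0, 360, step_deg):
        samples.append((bearing, read_distance(bearing)))
    return samples

# Toy usage with a fake sensor: an obstacle about 1.5 m away around 30 degrees.
fake_sensor = lambda b: 1.5 if abs(b - 30) <= 20 else 8.0
print(sweep_obstacles(fake_sensor, step_deg=45))
```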

The sensor unit 130 may include an azimuth measurement sensor and may measure the azimuth angle of the UAV 100 and transmit the measured azimuth to the mobile terminal 200.

The unmanned aerial vehicle control unit 140 controls the overall operation of the UAV 100. In particular, when an obstacle is detected by the sensor unit 130, the unmanned aerial vehicle control unit 140 can detect an avoidance path along which the UAV 100 can avoid the obstacle. For this purpose, the unmanned aerial vehicle control unit 140 may include a mode setting unit 141, an image analyzing unit 142, an avoidance path detecting unit 143, and a flight control unit 144.

The mode setting unit 141 can determine whether the GPS position information of the UAV 100 is acquired through the unmanned aerial vehicle communication unit 110. If the GPS position information of the UAV 100 is acquired, the mode setting unit 141 may set the operation mode of the UAV 100 to the GPS-based tracking mode. If the GPS position information of the UAV 100 is not acquired, the mode setting unit 141 may set the operation mode of the UAV 100 to the captured-image-based tracking mode.

When the obstacle detection signal is received from the sensor unit 130, the mode setting unit 141 may compare the distance between the UAV 100 and the detected obstacle, included in the obstacle detection signal, with a predetermined distance and set the operation mode of the UAV 100 to the obstacle avoidance mode if that distance is less than the predetermined distance.
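
A minimal sketch of the mode-selection rule just described, assuming plain boolean and numeric inputs in place of the actual GPS module and sensor interfaces (the 2.0 m threshold is an arbitrary illustrative value):

```python
GPS_TRACKING = "gps_tracking"
IMAGE_TRACKING = "image_tracking"
OBSTACLE_AVOIDANCE = "obstacle_avoidance"

def select_mode(gps_fix_available, obstacle_distance_m=None,
                obstacle_threshold_m=2.0):
    """Return the operating mode of the UAV.

    gps_fix_available: True when GPS position information could be acquired.
    obstacle_distance_m: distance to a detected obstacle ahead, or None.
    obstacle_threshold_m: preset distance below which avoidance takes over
                          (the 2.0 m default is an illustrative assumption).
    """
    if obstacle_distance_m is not None and obstacle_distance_m < obstacle_threshold_m:
        return OBSTACLE_AVOIDANCE          # obstacle too close: avoid first
    if gps_fix_available:
        return GPS_TRACKING                # GPS-based tracking mode
    return IMAGE_TRACKING                  # fall back to captured-image tracking

print(select_mode(True))                             # gps_tracking
print(select_mode(False))                            # image_tracking
print(select_mode(True, obstacle_distance_m=1.2))    # obstacle_avoidance
```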

Hereinafter, the way the UAV 100 is controlled in each case where its operation mode is set to the GPS-based tracking mode, the captured-image-based tracking mode, or the obstacle avoidance mode will be described.

When the operation mode of the UAV 100 is set to the GPS-based tracking mode by the mode setting unit 141, the flight control unit 144 transmits the GPS position information of the UAV 100 to the mobile terminal 200 through the unmanned aerial vehicle communication unit 110. The flight control unit 144 then receives from the mobile terminal 200 the flight control signal generated according to the GPS position information of the UAV 100 and controls the flight of the UAV 100 according to the received flight control signal.

When the operation mode of the UAV 100 is set to the captured-image-based tracking mode by the mode setting unit 141, the image analyzing unit 142 analyzes the image captured by the camera of the UAV 100, and the flight control unit 144 controls the UAV 100 based on the analyzed image information.

Specifically, the image analysis unit 142 may analyze the image captured through the camera provided on the UAV 100 and detect the user in the captured image. Here, detecting the user in the captured image may mean detecting the whole body of the user or a part of the body such as the face. Methods of detecting a user's whole body or face in a captured image are disclosed in detail in Korean Patent Laid-Open Publication No. 2014-0042024 and Korean Patent Laid-Open Publication No. 2003-0040680.

When the user is detected in the captured image, the image analysis unit 142 can check whether the detected user is located within the predetermined proper existence range within the image, that is, whether the shooting direction of the UAV 100 and the direction in which the user is located match. If the user is confirmed not to be located within the predetermined proper existence range in the image, the image analyzing unit 142 may detect direction information on the position where the user is present. Detecting this direction information may include detecting the coordinates of the center point of the predetermined proper existence range in the image, detecting the position coordinates of the user in the image, and calculating the direction toward the user with respect to that center point.
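
One possible realisation of the direction computation above, given as a sketch: the centre of the preset proper existence range and the detected user position are taken as pixel coordinates, and the offset and angle from the centre to the user are derived. The coordinate convention and function name are assumptions for illustration.

```python
import math

def direction_to_user(range_center, user_pos):
    """Return (dx, dy, angle_deg) from the centre of the proper existence
    range to the detected user position, both given as (x, y) pixel
    coordinates with x growing rightwards and y growing downwards."""
    dx = user_pos[0] - range_center[0]
    dy = user_pos[1] - range_center[1]
    angle = math.degrees(math.atan2(dy, dx))
    return dx, dy, angle

# Example: 640x480 image, proper existence range centred in the frame,
# user detected towards the upper-right corner.
dx, dy, angle = direction_to_user((320, 240), (500, 120))
print(dx, dy, round(angle, 1))   # 180 -120 -33.7 -> yaw right to recentre
```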

The flight control unit 144 may rotate the UAV 100 according to the detected direction information on the position where the user is present. In doing so, the flight control unit 144 may control the UAV 100 so that the front of the UAV 100, that is, the shooting direction of the camera previously installed on the UAV 100, is aligned with the detected direction of the user.

In addition, after the UAV 100 has been rotated so that the user is positioned within the preset proper existence range in the image, the image analyzing unit 142 continuously checks whether the user moves out of that range. If it is confirmed that the user has moved out of the preset proper existence range, the image analyzing unit 142 can detect the direction in which the user departed. To do so, it examines the previously captured images, extracts the image frame immediately before the user departed from the preset proper existence range and the image frame immediately after the departure, and uses the user's position in those two frames to determine the direction of departure. For example, referring to FIGS. 4 and 5, when a user located within the preset proper existence range A, as shown in FIG. 4, moves out of that range, the image analysis unit 142 can detect the direction information of the departure from the user's position (a, b) before leaving the range A and the user's position (a', b') after leaving it, for example as the direction of the displacement (a' - a, b' - b).

When the image analysis unit 142 determines that the user is out of the preset proper existence range A, the flight control unit 144 can rotate the UAV 100 based on the direction information of the user detected by the image analysis unit 142. In doing so, the flight control unit 144 may rotate the UAV 100 so that the front of the UAV 100, or the shooting direction of the camera provided in advance on the UAV 100, coincides with the detected direction of the user.

On the other hand, the preset proper existence range may be set as a preset ratio of the camera frame size of the UAV 100, or may be set by a user's input.

The image analyzing unit 142 also analyzes the image captured by the camera of the UAV 100 to check whether the UAV 100 is shooting while maintaining a certain distance from the user, by detecting whether the user appears larger or smaller than a preset size in the image. Comparing the user's size with the preset size may be done by detecting the whole body of the user or a part of the body such as the face in the captured image and checking whether the detected whole body or body part is larger or smaller than the preset size. For example, referring to FIGS. 6A and 6B, the image analyzing unit 142 detects the user's face in the captured image and detects whether the detected face is larger or smaller than a preset size B. When it detects that the detected whole body or body part of the user is larger or smaller than the preset size in the image, the image analyzing unit 142 can transmit a corresponding signal to the flight control unit 144.

If the flight control unit 144 is informed that the whole body or body part of the user detected by the image analysis unit 142 is larger than the preset size in the image, it determines that the UAV 100 has come closer to the user than the proper distance and controls the UAV 100 to stop or move backward. If it is detected that the whole body or body part of the user is smaller than the preset size in the image, the flight control unit 144 determines that the UAV 100 is farther from the user than the proper distance and controls the UAV 100 to move forward.
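
The distance-keeping rule of the two preceding paragraphs can be sketched as a simple three-way decision on the apparent user size; the size band, tolerance, and command names below are illustrative assumptions, not values from the embodiment.

```python
def distance_keeping_command(detected_size_px, target_size_px=120, tolerance_px=20):
    """Decide a longitudinal command from the apparent size of the user in
    the captured image (e.g. face bounding-box height in pixels).

    Larger than the preset band  -> user is too close -> move backward.
    Smaller than the preset band -> user is too far   -> move forward.
    Inside the band              -> hold position.
    """
    if detected_size_px > target_size_px + tolerance_px:
        return "backward"
    if detected_size_px < target_size_px - tolerance_px:
        return "forward"
    return "hold"

for size in (60, 125, 200):
    print(size, "->", distance_keeping_command(size))
# 60 -> forward, 125 -> hold, 200 -> backward
```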

After the flight control unit 144 has controlled the flight of the UAV 100, the image analysis unit 142 again analyzes the image captured by the camera to check whether the UAV 100 is now shooting at the proper distance from the user. It re-detects whether the user appears larger or smaller than the preset size in the image and, if necessary, has the flight control unit 144 adjust the flight again, so that the UAV 100 keeps shooting while maintaining a certain distance from the user.

When the operation mode of the UAV 100 is set to the obstacle avoidance mode by the mode setting unit 141, the avoidance path detecting unit 143 detects the obstacles existing around the UAV 100 and detects an avoidance path along which the UAV 100 can fly while avoiding them.

Specifically, when the sensor unit 130 detects that an obstacle is present in front of the UAV 100, the avoidance path detecting unit 143 first stops the UAV 100 to prevent it from colliding with the obstacle. After stopping the UAV 100, the avoidance path detecting unit 143 may collect, through the sensor unit 130, obstacle information over 360° around the UAV 100 and generate an obstacle map from it. The obstacle map may contain, with the UAV 100 as its center point, the direction and distance information of the obstacles over 360° around the UAV 100. The avoidance path detecting unit 143 then checks, in the generated obstacle map, whether there is an avoidance path through which the UAV 100 can pass within the front 180°. If such a path exists, the avoidance path detecting unit 143 transmits a flight control signal containing the direction information of the avoidance path to the flight control unit 144. For example, referring to FIGS. 7 to 9, when an obstacle is detected at a distance of d1 or less in front of the UAV 100 as shown in FIG. 7, the avoidance path detecting unit 143 can detect an avoidance path through which the UAV 100 can pass within the front 180°, as shown in FIG. 8, and the flight control unit 144 can control the flight of the UAV 100 according to that avoidance path, as shown in FIG. 9.
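
A simplified sketch of the obstacle map and the front-180° search described above: the map is modelled as (bearing, distance) samples around the UAV 100, and a passable bearing is one whose measured distance, reduced by a margin for the airframe size, still exceeds a clearance threshold. All numeric values and names are illustrative assumptions.

```python
def find_avoidance_heading(obstacle_map, uav_half_width_m=0.5,
                           clearance_m=3.0, front=True):
    """obstacle_map: iterable of (bearing_deg, distance_m) pairs measured
    around the UAV, bearing 0 = straight ahead, covering -180..180.
    Returns a passable bearing in the requested half
    (front: |bearing| <= 90, rear: |bearing| > 90) or None."""
    candidates = []
    for bearing, distance in obstacle_map:
        in_front = abs(bearing) <= 90
        if in_front != front:
            continue
        if distance - uav_half_width_m >= clearance_m:
            candidates.append((abs(bearing), bearing))
    if not candidates:
        return None
    # Prefer the passable bearing that deviates least from straight ahead/behind.
    return min(candidates)[1]

# Toy map: wall ahead between 0 and 30 degrees, open space elsewhere.
toy_map = [(b, 1.0 if 0 <= b <= 30 else 10.0) for b in range(-180, 181, 10)]
print(find_avoidance_heading(toy_map))        # -10 (slight turn to the left)
```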

When the avoidance path detecting unit 143 finds no avoidance path through which the UAV 100 can pass within the front 180° of the generated obstacle map, it detects an avoidance path within the rear 180° and transmits a flight control signal containing the direction information of the avoidance path detected in the rear 180° to the flight control unit 144.

Detecting the avoidance path through which the UAV 100 can pass may mean detecting, in the obstacle map, a path wide enough for the UAV 100, reflecting previously stored size information of the UAV 100.

Since obstacle detection is performed in the direction of travel while the UAV 100 is flying, an obstacle may well exist somewhere within the front 180° when an obstacle is sensed. The rear 180°, however, is the route the UAV 100 has already passed through, so it is very unlikely that no avoidance path exists within the rear 180°. In the exceptional case where no avoidance path exists in either the front or the rear direction, the UAV 100 may be stopped (hover and wait at its current position) or controlled to land.

The flight control unit 144 can receive the flight control signal including the information on the avoidance path from the avoidance path detecting unit 143 and control the flight so that the UAV 100 moves according to the direction information of the avoidance path included in the flight control signal.

To decide whether to switch the operation mode of the UAV 100 from the obstacle avoidance mode back to the tracking mode, the flight control unit 144 checks through the sensor unit 130 whether an obstacle is still detected in front of the UAV 100. If no obstacle is detected in front of the UAV 100, the flight control unit 144 may switch the operation mode of the UAV 100 to the tracking mode. If an obstacle is still detected, the flight control unit 144 keeps moving the UAV 100 according to the direction information of the avoidance path included in the flight control signal until no obstacle is detected in front of the UAV 100. Once no obstacle is detected and the operation mode is to be switched from the obstacle avoidance mode to the tracking mode, the flight control unit 144 determines which tracking mode to use by checking whether GPS position information of the UAV 100 is detected at the current position. If GPS position information is detected at the current position, the flight control unit 144 controls the UAV 100 according to the GPS-based tracking mode described above. If GPS position information is not detected at the current position, the flight control unit 144 controls the UAV 100 to photograph the full 360° around itself using the onboard camera, analyzes the captured surrounding image, and checks whether the user is detected around the UAV 100. If the user is detected in the captured surrounding image while GPS position information is not detected, the flight control unit 144 controls the UAV 100 according to the captured-image-based tracking mode. If the user is not detected either, the flight control unit 144 controls the UAV 100 to move within a certain radius of its current position in order to detect GPS position information, either according to a predetermined movement pattern or by moving freely within that radius. While moving within the predetermined radius, the flight control unit 144 tries to detect the GPS position information of the UAV 100, and once it is detected, controls the UAV 100 according to the GPS-based tracking mode.
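
The switch back from the obstacle avoidance mode to a tracking mode can be summarised by the small decision routine below, a sketch in which the sensor, GPS, and vision checks are abstracted into boolean inputs and the search movement into an assumed callback.

```python
def resume_tracking(obstacle_ahead, gps_fix, user_visible, search_step):
    """One decision step after an avoidance manoeuvre.

    obstacle_ahead : True while an obstacle is still detected in front.
    gps_fix        : True when GPS position information is detected again.
    user_visible   : True when the user is found in the surrounding image.
    search_step    : callback that moves the UAV within a preset radius
                     while it keeps looking for a GPS fix (assumed helper).
    Returns the mode the UAV should run next.
    """
    if obstacle_ahead:
        return "obstacle_avoidance"        # keep following the avoidance path
    if gps_fix:
        return "gps_tracking"
    if user_visible:
        return "image_tracking"
    search_step()                          # wander within the preset radius
    return "searching"

print(resume_tracking(False, True, False, lambda: None))   # gps_tracking
print(resume_tracking(False, False, True, lambda: None))   # image_tracking
```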

Meanwhile, when the user cannot be tracked either through GPS position information or through the captured image, the UAV 100 moves within the predetermined radius in order to detect its GPS position information, and the movement radius may be set wide enough for GPS position information of the UAV 100 to be detected.

Hereinafter, an unmanned aerial vehicle control method according to an embodiment of the present invention will be described with reference to FIG. 10.

First, when the power of the UAV 100 is applied by the user, it is determined whether the GPS position information of the UAV 100 is detected through the UAV communication unit 110 (310).

When it is confirmed that the GPS position information of the UAV 100 is detected (310), the operation mode of the UAV 100 is set to the GPS-based tracking mode (320) and, before the automatic user tracking operation is performed, it is checked whether an obstacle is detected in front of the UAV 100 (330).

If it is determined that the GPS position information of the UAV 100 is not detected (310), the operation mode of the UAV 100 is set to the captured-image-based tracking mode (340) and it is checked whether an obstacle is detected in front of the UAV 100 before the automatic user tracking operation is performed (350).

To confirm whether an obstacle is detected in front of the UAV 100, it is determined through the sensor unit 130 provided in the UAV 100 whether an obstacle is detected within a predetermined distance ahead of the UAV 100.

When it is confirmed that an obstacle is detected in front of the UAV 100, the operation mode of the UAV 100 is switched from the tracking mode to the obstacle avoidance mode (360).

When it is confirmed that no obstacle is detected in front of the UAV 100 (330, 350), the UAV 100 is controlled according to the set tracking mode; if an obstacle is then detected through the sensor unit 130 while operating in the tracking mode, the operation mode of the UAV 100 may be switched from the tracking mode to the obstacle avoidance mode.

A method of controlling the UAV 100 according to the set mode will be described later with reference to FIGS. 11 to 13B.

FIG. 11 is a flowchart illustrating a method for controlling the unmanned aerial vehicle when the operation mode of the UAV is set to the GPS-based tracking mode in FIG. 10.

First, when the operation mode of the UAV 100 is set to the GPS-based tracking mode, the GPS position information of the UAV 100 is transmitted to the mobile terminal 200 (410), a flight control signal is received from the mobile terminal 200 (420), and the flight of the UAV 100 is controlled according to the received flight control signal (430).

At this time, the flight control signal received from the mobile terminal 200 may include a signal for advancing the UAV 100, a forward stop signal, or a rotation control signal.

FIGS. 12A and 12B are flowcharts illustrating a method for controlling the unmanned aerial vehicle when the operation mode of the UAV is set to the captured-image-based tracking mode in FIG. 10.

Referring to FIG. 12A, in order to control the UAV 100 so that the user is photographed at an appropriate shooting angle, an image captured in real time through the photographing unit 120 provided in the UAV 100 is analyzed to detect the user (510).

At this time, the detection of the user in the photographed image may be a detection of the whole body of the user or a part of the body such as the face of the user in the photographed image.

It is then checked whether the detected user is present within the predetermined proper existence range within the image (520); if the user is confirmed not to be within that range, the direction in which the user is located is detected (530).

In this case, the direction in which the user is located in the image may be detected as the direction of the user's position relative to the center point of the preset proper existence range.

After the direction of the user located outside the preset proper existence range within the image is detected (530), the UAV 100 is rotated according to the detected direction (540).

At this time, rotating the UAV 100 according to the detected direction may mean rotating the UAV 100 so that its front direction (the shooting direction of the camera provided on the UAV 100) faces the detected direction.

After the UAV is rotated (540), it is checked in real time whether the user moves out of the preset proper existence range in the captured image, so that the UAV 100 can be controlled according to the movement of the user (550).

If it is confirmed that the user is out of the preset proper existence range (550), the direction in which the user departed from the range is detected (560) and the process returns to the step of rotating the UAV according to the detected direction.

Referring to FIG. 12B, in order to control the UAV 100 so that it flies at a proper distance from the user, the image captured in real time by the UAV 100 is analyzed to detect the user (610).

At this time, the detection of the user in the photographed image may be a detection of the whole body of the user or a part of the body such as the face of the user in the photographed image.

In order to check whether the UAV 100 is close to or farther away from the user, it is checked whether the detected user is larger or smaller than a predetermined size in the image (620).

If it is confirmed that the user detected in the image is larger or smaller than the preset size (620), it is determined that the UAV 100 is closer to or farther from the user than the proper distance, and the flight of the UAV 100 is controlled accordingly (630).

Controlling the flight of the UAV 100 here means that, when the user appears larger than the preset size in the image, it is determined that the UAV 100 has come closer to the user than the predetermined distance and the UAV 100 is stopped or moved backward, and that, when the user appears smaller than the preset size in the image, it is determined that the UAV 100 is farther from the user than the predetermined distance and the UAV 100 is moved forward.

FIGS. 13A and 13B are flowcharts illustrating a method for controlling the unmanned aerial vehicle when its operation mode is set to the obstacle avoidance mode and a method for controlling the UAV when its operation mode is switched from the obstacle avoidance mode to the tracking mode.

Referring to FIG. 13A, when the operation mode of the UAV 100 is set to the obstacle avoidance mode, the flight of the UAV 100 is first stopped to prevent a collision with the obstacle due to the movement of the UAV 100 (710).

After the UAV 100 is stopped (710), an obstacle map of the surroundings of the UAV 100 is generated while the UAV 100 remains stopped (720).

Generating the obstacle map for the vicinity of the UAV 100 may mean generating a map showing the direction and distance information of obstacles over the full 360° around the UAV 100.

In the generated obstacle map, it is checked whether an avoidance path exists within the front 180° of the UAV 100 (730).

Whether an avoidance path exists within the front 180° of the UAV 100 can be confirmed by reflecting the previously stored size information of the UAV 100 in the obstacle map and determining whether there is a path through which the UAV 100 can pass.

When it is confirmed that an avoidance path exists within the front 180° of the UAV 100 (730), the flight of the UAV 100 is controlled so that the UAV 100 moves according to that avoidance path (740).

When it is confirmed that no avoidance path exists within the front 180° of the UAV 100, an avoidance path is detected within the rear 180° of the UAV 100 in the obstacle map (750), and the flight of the UAV 100 is controlled so that the UAV 100 moves according to that avoidance path (740).

After the flight of the UAV 100 has been controlled according to the avoidance path, it is checked whether an obstacle is still detected in front of the UAV 100 in order to determine whether the UAV 100 has completely avoided the obstacle and can switch back to the tracking mode (760).

If no obstacle is detected in front of the UAV 100, it is determined that switching back to the tracking mode is possible, and the operation mode of the UAV 100 is switched from the obstacle avoidance mode to the tracking mode (770).

Referring to FIG. 13B, when the operation mode of the UAV 100 is switched from the obstacle avoidance mode to the tracking mode, it must be decided whether to control the UAV 100 according to the GPS-based tracking mode or the captured-image-based tracking mode. To do so, it is first checked whether the GPS position information of the UAV 100 is detected (810). If the GPS position information of the UAV 100 is not detected, the surroundings of the UAV 100 are photographed and analyzed to check whether the user is detected in the vicinity of the UAV 100 (820).

When the user is detected in the vicinity of the UAV 100, the UAV 100 is set to the captured-image-based tracking mode so that it performs automatic user tracking based on the captured image (830).

If the user is not detected in the vicinity of the UAV 100, the UAV 100 moves within a predetermined radius of its current position while trying to detect its GPS position information and, once the GPS position information is detected, its operation mode is set to the GPS-based tracking mode to control the UAV 100 (850).

The movement of the UAV 100 within the predetermined radius may follow a predetermined movement pattern until the GPS position information of the UAV 100 is detected, or the UAV 100 may simply be moved freely within the radius.

FIG. 14 is a block diagram of the mobile terminal shown in FIG. 1, and FIG. 15 is a diagram illustrating the operation of the distance calculating unit shown in FIG. 14.

The mobile terminal 200 included in the unmanned aerial vehicle control system 1 according to an embodiment of the present invention receives the GPS position information of the UAV 100 from the UAV 100 and, using that GPS position information, controls the flight of the UAV 100 so that the UAV 100 automatically tracks the user. To this end, the mobile terminal 200 includes a mobile terminal communication unit 210, a user input unit 220, an output unit 230, a mobile terminal memory unit 240, an interface unit 250, a mobile terminal control unit 260, and a power supply unit 270.

The mobile terminal communication unit 210 may include one or more components for performing wireless communication between the mobile terminal 200 and the UAV 100.

For example, the mobile terminal communication unit 210 may include a mobile communication module 211, a wireless Internet module 212, a short range communication module 213, and a location information module 214.

The mobile communication module 211 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the radio signal may include various types of data related to a voice call signal, a video call signal, or text/multimedia message transmission and reception. The mobile communication module 211 can receive the position information of the UAV 100 from the UAV 100 and transmit the flight control signal generated by the mobile terminal 200 to the UAV 100. In addition, the mobile communication module 211 can receive the azimuth information of the UAV 100 from the UAV 100.

The wireless Internet module 212 is a module for wireless Internet access. The wireless Internet module 212 may be internal or external.

The short-range communication module 213 is a module for short-range communication. The local area communication technology may include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), and Zigbee communication.

The location information module 214 is a module for confirming or obtaining the location of the mobile terminal 200; a representative example is a GPS (Global Positioning System) module. The GPS module receives position information from a plurality of satellites, and the location information may include coordinate information expressed as latitude and longitude. For example, the GPS module can calculate the current position by triangulation, measuring the precise time and distance from three or more satellites to obtain three different distances; a method of obtaining distance and time information from three satellites and correcting the error with one additional satellite may also be used. In particular, the GPS module can obtain latitude, longitude and altitude as well as three-dimensional velocity information and accurate time from the position information received from the satellites.

The user input unit 220 generates input data for a user to control operation of the terminal. The user input unit 220 may include a key pad, a dome switch, a touch pad, a jog wheel, a jog switch, and the like. Particularly, when the touch pad has a mutual layer structure with a display unit provided in an output unit 230, which will be described later, it can be called a touch screen.

The user input unit 220 can receive a control signal for the UAV 100 from the user and, when such a control signal is input, transmit it to the UAV 100 through the mobile terminal communication unit 210.

The output unit 230 is for outputting an audio signal, a video signal, or an alarm signal, and may include a display unit, an audio output module, and the like.

The mobile terminal memory unit 240 may store a program for processing and control by the mobile terminal control unit 260 and may temporarily store input/output data. In particular, the mobile terminal memory unit 240 according to the present embodiment may store information for automatic tracking of the user by the UAV 100, including predetermined distance information between the UAV 100 and the user.

The interface unit 250 serves as an interface with all external devices connected to the mobile terminal 200. For example, it may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output port, an earphone port, and the like.

Here, the identification module may include a user authentication module, a subscriber authentication module, a general-purpose user authentication module, and the like as chips for storing various information for authenticating the usage right of the mobile terminal 200. The device equipped with the identification module can be manufactured in the form of a smart card.

The mobile terminal control unit 260 may control the overall operation of the mobile terminal 200 and may generate a control signal for controlling the UAV 100. For this purpose, the mobile terminal control unit 260 may include a distance calculating unit 261, an azimuth angle calculating unit 262, and a flight control signal generating unit 263.

The distance calculating unit 261 can calculate the distance between the UAV 100 and the mobile terminal 200 using the GPS position information of the UAV 100 received from the UAV 100 and the GPS position information of the mobile terminal 200. For example, referring to FIG. 15, if the GPS coordinate information of the mobile terminal 200 owned by the user is (x1, y1) and the GPS coordinate information of the UAV 100 is (x2, y2), the distance calculating unit 261 can calculate the distance between the UAV 100 and the mobile terminal 200 according to the formula

d = √((x2 - x1)² + (y2 - y1)²).
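
Taking the planar distance formula above at face value (raw GPS latitude/longitude would first have to be projected to a planar coordinate system in metres, which this sketch glosses over), the distance check and the forward/forward-stop decision described in the following paragraphs might look like this; the preset distance of 5.0 m is an assumption.

```python
import math

def planar_distance(p1, p2):
    """Distance between two points given as (x, y) planar coordinates,
    following d = sqrt((x2 - x1)^2 + (y2 - y1)^2) from the description."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def longitudinal_command(terminal_xy, uav_xy, preset_distance=5.0):
    """Forward while the UAV is at or beyond the preset distance from the
    terminal (i.e. the user), forward-stop once it is closer than that."""
    d = planar_distance(terminal_xy, uav_xy)
    return ("forward", d) if d >= preset_distance else ("forward_stop", d)

print(longitudinal_command((0.0, 0.0), (8.0, 6.0)))   # ('forward', 10.0)
print(longitudinal_command((0.0, 0.0), (3.0, 0.0)))   # ('forward_stop', 3.0)
```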

The flight control signal generating unit 263 compares the distance between the UAV 100 and the mobile terminal 200 calculated by the distance calculating unit 261 with a predetermined distance to check whether the distance between the UAV 100 and the mobile terminal 200 is greater than or equal to the predetermined distance. Here, the predetermined distance means the distance most suitable for photographing the user with the camera installed in advance on the UAV 100, and can be set by the user. When the distance between the UAV 100 and the mobile terminal 200 is determined to be equal to or greater than the predetermined distance, the flight control signal generating unit 263 may generate a forward signal and transmit it to the UAV 100.

In addition, after transmitting the forward signal to the UAV 100, the mobile terminal 200 according to an embodiment of the present invention can continuously check whether the UAV 100 has come within the predetermined distance of the user by receiving the position information of the UAV 100 again from the UAV 100 that has moved according to the forward signal. The distance calculating unit 261 recalculates the distance between the UAV 100 and the mobile terminal 200 from the newly received position information. The flight control signal generating unit 263 compares the recalculated distance with the predetermined distance, checks whether it is still equal to or greater than the predetermined distance, and generates the flight control signal of the UAV 100 according to the result. If the distance between the UAV 100 and the mobile terminal 200 is determined to be less than the predetermined distance, the flight control signal generating unit 263 may generate a flight control signal including a forward stop signal and transmit it to the UAV 100. If the distance is still equal to or greater than the predetermined distance, the flight control signal generating unit 263 keeps checking the distance between the UAV 100 and the mobile terminal 200 until it becomes less than the predetermined distance.

The azimuth angle calculating unit 262 can calculate the azimuth angle between the UAV 100 and the mobile terminal 200 using the GPS position information of the UAV 100 received from the UAV 100 and the GPS position information of the mobile terminal 200. For example, if the GPS coordinate information of the mobile terminal 200 owned by the user is (x1, y1) and the GPS coordinate information of the UAV 100 is (x2, y2), the azimuth angle calculating unit 262 can calculate the azimuth angle between the UAV 100 and the mobile terminal 200 according to the formula

θ = tan⁻¹((y2 - y1) / (x2 - x1)).

The flight control signal generating unit 263 may generate a flight control signal for rotating the UAV 100 according to the azimuth angle between the UAV 100 and the mobile terminal 200 calculated by the azimuth angle calculating unit 262. Specifically, the flight control signal generating unit 263 may calculate the difference between the azimuth angle between the UAV 100 and the mobile terminal 200 and the current azimuth angle of the UAV 100 received from the UAV 100. The flight control signal generating unit 263 then generates a rotation control signal that rotates the UAV 100 by the calculated difference so that the azimuth of the UAV 100 matches the azimuth angle between the UAV 100 and the mobile terminal 200, and can transmit the generated rotation control signal to the UAV 100 through the mobile terminal communication unit 210.
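
Under the same planar assumption, the azimuth angle and the rotation control value could be sketched as follows; atan2 is used so that all quadrants are handled and the difference is wrapped into the range [-180°, 180°). The 5° tolerance is an illustrative assumption.

```python
import math

def azimuth_deg(from_xy, to_xy):
    """Bearing from one planar point to another, in degrees, measured from
    the +x axis, i.e. atan2(y2 - y1, x2 - x1)."""
    return math.degrees(math.atan2(to_xy[1] - from_xy[1], to_xy[0] - from_xy[0]))

def rotation_command(uav_xy, terminal_xy, uav_heading_deg, tolerance_deg=5.0):
    """Difference between the UAV-to-terminal azimuth and the UAV's current
    heading, wrapped into [-180, 180); a non-zero value asks the UAV to yaw
    by that amount so that it faces the mobile terminal (the user)."""
    target = azimuth_deg(uav_xy, terminal_xy)
    diff = (target - uav_heading_deg + 180.0) % 360.0 - 180.0
    return 0.0 if abs(diff) <= tolerance_deg else diff

print(rotation_command((0, 0), (10, 10), uav_heading_deg=0))    # 45.0
print(rotation_command((0, 0), (10, 10), uav_heading_deg=44))   # 0.0 (within tolerance)
```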

The power supply unit 270 receives external power and internal power under the control of the mobile terminal control unit 260 and supplies power necessary for operation of the respective components.

16A and 16B are flowcharts illustrating a control method of a mobile terminal according to an embodiment of the present invention.

Referring to FIG. 16A, so that the UAV 100 can fly within a suitable distance from the user, the position information of the UAV 100 is received from the UAV 100 (910), the position information of the mobile terminal 200 is detected, and the distance between the UAV 100 and the mobile terminal 200 is calculated using the position information of the UAV 100 and the position information of the mobile terminal 200 (930).

The distance between the UAV 100 and the mobile terminal 200 is compared with a preset distance to check whether the distance between the UAV 100 and the mobile terminal 200 is equal to or greater than a predetermined distance.

At this time, when it is confirmed that the distance between the UAV 100 and the mobile terminal 200 is equal to or greater than the preset distance, it is determined that the UAV 100 is not flying within a proper distance from the user, and a forward signal is transmitted to the UAV 100 (950).

After the forward signal is transmitted to the UAV 100 in operation 950 and the UAV 100 advances, the position information of the UAV 100 is received again from the UAV 100 (960), and the distance between the UAV 100 and the mobile terminal 200 is recalculated (970).

At this time, it is checked whether the recalculated distance between the UAV 100 and the mobile terminal 200 is equal to or greater than the preset distance (980). If the distance is still equal to or greater than the preset distance, the position information of the UAV 100 is received again and the distance is checked once more; if the distance is less than the preset distance, it is determined that the UAV 100 is located within an appropriate distance from the user, and a forward stop signal is transmitted to the UAV 100 (990).
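Putting operations 910 to 990 together, the FIG. 16A loop could be sketched as follows; uav_link and read_terminal_fix are hypothetical stand-ins for the mobile communication unit 210 and the mobile terminal's GPS module, and gps_distance_m is the helper from the sketch above.

```python
def track_user_by_distance(uav_link, read_terminal_fix, preset_distance_m):
    """Sketch of FIG. 16A: advance the UAV while it is too far from the user,
    then send a forward-stop signal once it is within the preset distance."""
    uav_fix = uav_link.receive_position()                          # operation 910
    distance = gps_distance_m(*uav_fix, *read_terminal_fix())      # operation 930
    if distance < preset_distance_m:                               # already close enough
        return
    uav_link.send("FORWARD")                                       # operation 950
    while True:
        uav_fix = uav_link.receive_position()                      # operation 960
        distance = gps_distance_m(*uav_fix, *read_terminal_fix())  # operation 970
        if distance < preset_distance_m:                           # operation 980
            uav_link.send("STOP_FORWARD")                          # operation 990
            return
```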

Referring to FIG. 16B, in order for the UAV 100 to fly within a suitable distance from the user, the position information and the azimuth information of the UAV 100 are received from the UAV 100 (1010), and the azimuth angle between the UAV 100 and the mobile terminal 200 is calculated using the position information of the UAV 100 and the position information of the mobile terminal 200 (1030).

The difference value between the calculated azimuth angle between the UAV 100 and the mobile terminal 200 and the azimuth angle of the UAV 100 is calculated (1040), a rotation control signal including the calculated difference value is generated (1050), and the generated rotation control signal is transmitted to the UAV 100 (1060).
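Operations 1010 to 1060 of FIG. 16B can be sketched in the same style, reusing bearing_deg and rotation_difference_deg from the earlier sketch; the link object and message format are again illustrative assumptions rather than the patent's own interface.

```python
def align_uav_heading(uav_link, read_terminal_fix):
    """Sketch of FIG. 16B: rotate the UAV so that its azimuth matches the
    azimuth computed from the two GPS fixes."""
    uav_fix, uav_azimuth = uav_link.receive_position_and_azimuth()   # operation 1010
    target = bearing_deg(*read_terminal_fix(), *uav_fix)             # operation 1030
    difference = rotation_difference_deg(target, uav_azimuth)        # operation 1040
    uav_link.send({"type": "ROTATE", "difference_deg": difference})  # operations 1050 and 1060
```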

Such a technique for controlling an unmanned aerial vehicle can be implemented in an application, or in the form of program instructions that can be executed through various computer components and recorded on a computer-readable recording medium. The computer-readable recording medium may include program instructions, data files, data structures, and the like, alone or in combination.

The program instructions recorded on the computer-readable recording medium may be specially designed and configured for the present invention, or may be known and available to those skilled in the art of computer software.

Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory.

Examples of program instructions include machine language code such as those generated by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware device may be configured to operate as one or more software modules for performing the processing according to the present invention, and vice versa.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes and modifications may be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

1: Unmanned aircraft control system
100: Unmanned aircraft
200: mobile terminal

Claims (13)

An unmanned aircraft control system comprising:
an unmanned airplane that sets an operation mode to either a GPS-based tracking mode or a photographed image-based tracking mode according to whether or not GPS position information is detected, controls its flight by receiving a flight control signal based on the GPS position information from a mobile terminal when the operation mode is set to the GPS-based tracking mode, and controls its flight by analyzing an image photographed by a camera provided in advance when the operation mode is set to the photographed image-based tracking mode; and
a mobile terminal for generating the flight control signal based on the GPS position information using the GPS position information of the unmanned airplane and transmitting the generated flight control signal to the unmanned airplane,
wherein the unmanned airplane checks whether an obstacle is present in front of the unmanned airplane through an obstacle detection sensor provided in advance; when an obstacle exists in front of the unmanned airplane, generates an obstacle map for the vicinity of the unmanned airplane and controls the flight of the unmanned airplane according to an avoidance path based on the obstacle map; checks whether an obstacle is still detected in front of the unmanned airplane; and, if no obstacle is detected in front of the unmanned airplane, switches the operation mode of the unmanned airplane from the obstacle avoidance mode to the tracking mode to control the unmanned airplane,
wherein switching the operation mode of the unmanned airplane from the obstacle avoidance mode to the tracking mode to control the unmanned airplane comprises: checking whether the GPS position information of the unmanned airplane is detected; controlling the unmanned airplane according to the GPS-based tracking mode when the GPS position information of the unmanned airplane is detected; checking whether the user is detected by analyzing a surrounding image photographed at the current position of the unmanned airplane when the GPS position information of the unmanned airplane is not detected; and, when the user is not detected, controlling the unmanned airplane to move within a certain radius until the GPS position information of the unmanned airplane is detected and then, after the GPS position information is detected, controlling the unmanned airplane according to the GPS-based tracking mode.
The unmanned aircraft control system according to claim 1,
wherein, when the operation mode of the unmanned airplane is set to the photographed image-based tracking mode, the unmanned airplane detects the user in the captured image and controls the flight of the unmanned airplane according to whether the user is located within a predetermined proper existence range within the captured image.
[Claim 3 is abandoned upon payment of the registration fee.]
The unmanned aircraft control system according to claim 2,
wherein controlling the flight of the unmanned airplane according to whether the user is located within the predetermined proper existence range comprises detecting the direction in which the user is located within the photographed image and controlling the unmanned airplane to rotate according to the direction in which the user is located, if the user is not located within the predetermined proper existence range.
[Claim 4 is abandoned upon payment of the registration fee.]
The unmanned aircraft control system according to claim 3,
wherein detecting the direction in which the user is located comprises detecting the position of the user in the photographed image and detecting the direction of the position where the user is present with respect to the center of the predetermined proper existence range.
[Claim 5 is abandoned upon payment of the registration fee.]
The unmanned aircraft control system according to claim 3,
wherein the unmanned airplane analyzes the photographed image after rotating according to the direction in which the user is located, confirms whether the user has moved out of the predetermined proper existence range due to the movement of the user, detects the direction in which the user has departed, and rotates according to the detected direction in which the user has departed.
The unmanned aircraft control system according to claim 1,
wherein, when the operation mode of the unmanned airplane is set to the photographed image-based tracking mode, the unmanned airplane detects the user in the captured image, checks whether the user appears larger or smaller than a predetermined size in the captured image, and controls the unmanned airplane to stop moving forward or to move backward if the user appears larger than the predetermined size.
The unmanned aircraft control system according to claim 1,
wherein the mobile terminal receives the GPS position information of the unmanned airplane from the unmanned airplane, detects the GPS position information of the mobile terminal, calculates a distance and an azimuth angle between the unmanned airplane and the mobile terminal using the GPS position information of the unmanned airplane and the GPS position information of the mobile terminal, and generates the flight control signal based on the GPS position information using the distance and the azimuth angle between the unmanned airplane and the mobile terminal.
[Claim 8 is abandoned upon payment of the registration fee.]
The unmanned aircraft control system according to claim 7,
wherein, in generating the flight control signal based on the GPS position information, the mobile terminal generates a flight control signal including a forward signal when the distance between the unmanned airplane and the mobile terminal is equal to or greater than a predetermined distance.
[Claim 9 is abandoned upon payment of the registration fee.]
The unmanned aircraft control system according to claim 7,
wherein, in generating the flight control signal based on the GPS position information, the mobile terminal calculates the difference between the azimuth angle between the unmanned airplane and the mobile terminal and the current azimuth angle of the unmanned airplane received from the unmanned airplane, and generates the flight control signal including a rotation control signal for controlling the rotation of the unmanned airplane according to the calculated difference.
Claims 10 to 12: deleted.

An unmanned aircraft control method, in which:
it is checked whether the GPS position information of the unmanned airplane is detected in the unmanned airplane,
an operation mode is set to one of a GPS-based tracking mode and a photographed image-based tracking mode depending on whether the GPS position information of the unmanned airplane is detected,
when the operation mode of the unmanned airplane is set to the GPS-based tracking mode, the mobile terminal controls the flight of the unmanned airplane according to the flight control signal generated using the GPS position information of the unmanned airplane and the GPS position information of the mobile terminal,
wherein, when the operation mode of the unmanned airplane is set to the photographed image-based tracking mode, the image photographed by the unmanned airplane is analyzed to control the flight of the unmanned airplane,
wherein the unmanned airplane checks whether an obstacle is present in front of the unmanned airplane through an obstacle detection sensor provided in advance; when an obstacle exists in front of the unmanned airplane, generates an obstacle map for the vicinity of the unmanned airplane and controls the flight of the unmanned airplane according to an avoidance path based on the obstacle map; checks whether an obstacle is still detected in front of the unmanned airplane; and, if no obstacle is detected in front of the unmanned airplane, switches the operation mode of the unmanned airplane from the obstacle avoidance mode to the tracking mode to control the unmanned airplane,
wherein switching the operation mode of the unmanned airplane from the obstacle avoidance mode to the tracking mode to control the unmanned airplane comprises: checking whether the GPS position information of the unmanned airplane is detected; controlling the unmanned airplane according to the GPS-based tracking mode when the GPS position information of the unmanned airplane is detected; checking whether the user is detected by analyzing a surrounding image photographed at the current position of the unmanned airplane when the GPS position information of the unmanned airplane is not detected; and, when the user is not detected, controlling the unmanned airplane to move within a certain radius until the GPS position information of the unmanned airplane is detected and then, after the GPS position information is detected, controlling the unmanned airplane according to the GPS-based tracking mode.
KR1020150132831A 2015-09-21 2015-09-21 Apparatus and method for controlling unmanned aerial vehicle KR101758093B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150132831A KR101758093B1 (en) 2015-09-21 2015-09-21 Apparatus and method for controlling unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150132831A KR101758093B1 (en) 2015-09-21 2015-09-21 Apparatus and method for controlling unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
KR20170034503A KR20170034503A (en) 2017-03-29
KR101758093B1 true KR101758093B1 (en) 2017-07-14

Family

ID=58497991

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150132831A KR101758093B1 (en) 2015-09-21 2015-09-21 Apparatus and method for controlling unmanned aerial vehicle

Country Status (1)

Country Link
KR (1) KR101758093B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200067743A (en) * 2018-11-02 2020-06-12 광주과학기술원 Fish net surveillance apparatus using Remotely-Operated underwater Vehicle, controlling method of the same

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102290746B1 (en) * 2017-04-12 2021-08-19 삼성전자주식회사 Device for controlling unmanned aerial vehicle, unmanned aerial vehicle controlled by the device and system
KR101943911B1 (en) * 2017-04-20 2019-01-30 아주대학교산학협력단 Method and apparatus for tracing position using drone
WO2018236181A1 (en) * 2017-06-23 2018-12-27 코아글림 주식회사 Mobile platform-based ahrs flight control apparatus
KR101924729B1 (en) * 2017-08-02 2018-12-03 서울대학교산학협력단 System and method for providing total logistic using drone
KR101943823B1 (en) * 2017-10-31 2019-01-31 주식회사 두시텍 UAV for accurate position data acquisition during high-speed flight and Determination of Unmanned Mission Equipment Synchronization for Accurate Position Data Acquisition during High Speed Flight
CN108860528A (en) * 2018-08-12 2018-11-23 鲲之眼(深圳)科技有限公司 Unmanned plane under a kind of modular intelligent water
CN109839113B (en) * 2019-03-18 2024-04-26 成都中科遥数智创科技有限公司 Method and device for controlling unmanned aerial vehicle to return to HOME position after GPS (Global positioning System) beyond visual range
KR102162055B1 (en) * 2019-12-20 2020-10-06 한국전자기술연구원 Intelligent Accelerator for UAV
KR102167414B1 (en) * 2020-04-27 2020-10-19 이한구 System and method for providing traffic violation detecting service using drone
CN115602003A (en) * 2022-09-29 2023-01-13 亿航智能设备(广州)有限公司(Cn) Unmanned aerial vehicle flight obstacle avoidance method, system and readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100962615B1 (en) * 2008-01-17 2010-06-10 대한민국(관리부서:국립수산과학원) Observation system of measurement the sea circumstances and aerial vehicle with unmanned and methods thereof
KR101501528B1 (en) * 2013-10-01 2015-03-11 재단법인대구경북과학기술원 System and method for unmanned aerial vehicle collision avoidance
CN104777847A (en) * 2014-01-13 2015-07-15 中南大学 Unmanned aerial vehicle target tracking system based on machine vision and ultra-wideband positioning technology

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100962615B1 (en) * 2008-01-17 2010-06-10 대한민국(관리부서:국립수산과학원) Observation system of measurement the sea circumstances and aerial vehicle with unmanned and methods thereof
KR101501528B1 (en) * 2013-10-01 2015-03-11 재단법인대구경북과학기술원 System and method for unmanned aerial vehicle collision avoidance
CN104777847A (en) * 2014-01-13 2015-07-15 中南大学 Unmanned aerial vehicle target tracking system based on machine vision and ultra-wideband positioning technology

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200067743A (en) * 2018-11-02 2020-06-12 광주과학기술원 Fish net surveillance apparatus using Remotely-Operated underwater Vehicle, controlling method of the same
KR102234697B1 (en) * 2018-11-02 2021-04-02 광주과학기술원 Fish net surveillance apparatus using Remotely-Operated underwater Vehicle, controlling method of the same

Also Published As

Publication number Publication date
KR20170034503A (en) 2017-03-29

Similar Documents

Publication Publication Date Title
KR101758093B1 (en) Apparatus and method for controlling unmanned aerial vehicle
US11448770B2 (en) Methods and systems for detecting signal spoofing
KR101948728B1 (en) Method and system for collecting data
US9599473B2 (en) Utilizing magnetic field based navigation
KR101728123B1 (en) Simultaneous Localization and Mapping by Using Earth's Magnetic Fields
CN107223200B (en) Navigation method, navigation device and terminal equipment
EP3251953B1 (en) Method and apparatus for landing flight device
US20180059659A1 (en) Delivery System and Delivery Method
US20170371353A1 (en) Automatic Tracking Mode For Controlling An Unmanned Aerial Vehicle
US20170234724A1 (en) Device for uav detection and identification
US20170201714A1 (en) Electronic device for generating video data
US10616533B2 (en) Surveillance system and method for camera-based surveillance
US20160116915A1 (en) Surveying areas using a radar system and an unmanned aerial vehicle
CN108521800A (en) Control method, control terminal and the machine readable storage medium of automatic driving vehicle
KR101959366B1 (en) Mutual recognition method between UAV and wireless device
CN110823218A (en) Object tracking system
US20190286928A1 (en) Mobile micro-location
Isaacs et al. GPS-optimal micro air vehicle navigation in degraded environments
Kwak et al. Emerging ICT UAV applications and services: Design of surveillance UAVs
CN106200654A (en) The control method of unmanned plane during flying speed and device
KR20120067479A (en) Navigation system using picture and method of cotnrolling the same
Souli et al. Cooperative relative positioning using signals of opportunity and inertial and visual modalities
US11561553B1 (en) System and method of providing a multi-modal localization for an object
KR20160128967A (en) Navigation system using picture and method of cotnrolling the same
US10726692B2 (en) Security apparatus and control method thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant