KR20170090603A - Method and system for controlling drone using hand motion tracking - Google Patents

Method and system for controlling drone using hand motion tracking

Info

Publication number
KR20170090603A
Authority
KR
South Korea
Prior art keywords
hand
control signal
movement
controlling
communication network
Prior art date
Application number
KR1020160011144A
Other languages
Korean (ko)
Inventor
박상욱
강찬우
황혜진
Original Assignee
아주대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 아주대학교산학협력단 filed Critical 아주대학교산학협력단
Priority to KR1020160011144A
Publication of KR20170090603A

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C27/00Rotorcraft; Rotors peculiar thereto
    • B64C27/04Helicopters
    • B64C27/08Helicopters with two or more rotors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0022Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the communication link
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06K9/00355
    • G06K9/00389
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W88/00Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02Terminal devices
    • H04W88/06Terminal devices adapted for operation in multiple networks or having at least two operational modes, e.g. multi-mode terminals
    • B64C2201/024
    • B64C2201/146

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

According to an embodiment of the present invention, provided is a drone control system using hand motion recognition, which includes: a computing device that generates a first control signal and a second control signal based on image processing of a stereoscopic image of the motion of a hand; a flight unit including a flight device of a drone whose motion is controlled based on the first control signal received through a first communication network; and a driving unit including a driving device that is mounted on the flight device of the drone and is controlled to mount or separate an object based on the second control signal received through a second communication network. Accordingly, the present invention can improve work efficiency by moving an object through control of the drone.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention

[0001] The present invention relates to a drone control system and method using hand motion recognition, and more particularly, to a system and method capable of controlling a drone in response to the movement of a hand, so that an object can be moved by controlling the drone through the movement of the hand alone.

Drone refers to a small unmanned aerial vehicle similar to a plane or helicopter. This type of unmanned aerial vehicle, first used for military purposes, is now routinely used in a variety of fields such as agriculture, aerial photography, and shipping, and is emerging as a new industry with an average annual growth rate of 8%. The drone industry is expected to expand to a large scale in the future, and its possibilities are broad. However, since existing drones use a control method based on an RC controller, user convenience is limited and precise operation is difficult.

Meanwhile, the Kinect sensor is a camera sensor specialized in the extraction and tracking of human joints. Using 3D stereoscopic image data, the Kinect sensor can detect the user's position and motion.

An object of the present invention is to control a drone using only the movement of the hand.

Another object of the present invention is to move an object by controlling a drone using only the movement of the hand.

It is another object of the present invention to use an existing communication network and transceiver for drone control.

According to an embodiment of the present invention, there is provided a drone control system using hand motion recognition, including: a computing device that generates a first control signal and a second control signal based on image processing of stereoscopic image data on the motion of a hand obtained from a Kinect sensor; a flight unit including a flight device of the drone, the movement of which is controlled based on the first control signal received via a first communication network; and a driving unit including a driving device mounted on the flight device of the drone, the driving device being controlled to mount or separate an object based on the second control signal received via a second communication network.

In the present invention, the computing device may include: a Kinect sensor for three-dimensionally capturing the motion of a user to acquire stereoscopic image data; a skeleton recognition unit for recognizing a skeleton from the image data acquired by the Kinect sensor; a hand motion determination unit for determining the motion of the hand based on the recognized skeleton; and a control signal generator for generating the first control signal or the second control signal from the motion of the hand.

In the present invention, the control signal generator may detect whether the determined hand motion contains a valid hand movement, and generate the first control signal for controlling the flight device of the drone when a valid hand movement is present.

In the present invention, the control signal generator may detect whether the determined hand motion contains a valid hand gesture, and generate the second control signal when a valid hand gesture is present.

In the present invention, the driving device may be an electromagnet which is turned on / off according to an electrical signal.

In the present invention, the first communication network may be a WiFi communication network, and the second communication network may be a Bluetooth communication network.

In the present invention, the driving unit may further include a controller for converting the second control signal into an electrical signal for controlling the driving unit.

In the present invention, the image processing may perform skeleton recognition, hand tracking, or finger tracking to detect movement of a user's hand.

In the present invention, the first control signal may be a signal for controlling the flight device of the drone so that the drone moves in correspondence with the movement of the hand.

In the present invention, the second control signal may be a signal that controls the driving device to mount an object when the gesture of the hand determined from the movement of the hand is a fist-clenching gesture, and controls the driving device to separate the object when the gesture of the hand is a hand-opening gesture.

According to another embodiment of the present invention, there is provided a drone control method using hand motion recognition, comprising the steps of: generating a first control signal and a second control signal based on image processing of stereoscopic image data on the movement of a hand obtained from a Kinect sensor; controlling the movement of the flight device of the drone based on the first control signal received via a first communication network; and controlling the driving device mounted on the flight device of the drone to mount or separate an object based on the second control signal received via a second communication network.

In the present invention, the step of generating the first control signal and the second control signal may include: acquiring stereoscopic image data by stereoscopically capturing the motion of a user using a Kinect sensor; recognizing a skeleton from the image data acquired by the Kinect sensor; determining the movement of the hand based on the recognized skeleton; and generating the first control signal or the second control signal from the motion of the hand.

In the present invention, the control signal generation step may include detecting whether the determined hand motion contains a valid hand movement, and generating the first control signal for controlling the flight device of the drone when a valid hand movement is present.

In the present invention, the control signal generation step may include detecting whether the determined hand motion contains a valid hand gesture, and generating the second control signal when a valid hand gesture is present.

In the present invention, the driving device may be an electromagnet which is turned on / off according to an electrical signal.

In the present invention, the first communication network may be a WiFi communication network, and the second communication network may be a Bluetooth communication network.

In the present invention, controlling the driving device may include converting the second control signal into an electrical signal for controlling the driving device.

In the present invention, the image processing may perform skeleton recognition, hand tracking, or finger tracking to detect movement of a user's hand.

In the present invention, the first control signal may be a signal for controlling the flight device of the drone so that the drone moves in correspondence with the movement of the hand.

In the present invention, the second control signal may be a signal that controls the driving device to mount an object when the gesture of the hand determined from the movement of the hand is a fist-clenching gesture, and controls the driving device to separate the object when the gesture of the hand is a hand-opening gesture.

According to the present invention, the flight device of the drone is controlled in response to the movement of the hand, and the driving device of the drone is controlled according to the gesture of the hand, so that an object can be moved by controlling the drone through the movement of the hand alone, thereby improving work efficiency.

According to the present invention, a drone control method based on the motion recognition of the user's hand is provided instead of the conventional method using an RC (Radio Control) controller, so that the convenience of drone control can be improved. That is, according to an embodiment of the present invention, the motion of the user's hand is analyzed through image processing and the control data of the drone is generated therefrom without a separate control device, so that the degree of freedom of the control method is increased and the number of devices can be reduced.

In addition, according to an embodiment of the present invention, hand motion recognition can trigger not only the movement of the drone but also specific actions of the drone, so that the drone can perform sophisticated operations that are impossible with a conventional RC controller.

In addition, according to one embodiment of the present invention, since an existing communication network and transceiver are used, the range of application of drone control can be broadened.

FIG. 1 is a schematic view of the apparatuses used in a drone control system using hand motion recognition according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating the internal configuration of a drone control system using hand motion recognition according to an embodiment of the present invention.
FIGS. 3A through 3C are diagrams illustrating a drone moving in response to a first control signal according to an embodiment of the present invention.
FIGS. 4A and 4B are diagrams illustrating a drone that operates in response to a second control signal in an embodiment of the present invention.
FIG. 5 is a flowchart illustrating, in time sequence, a drone control method using hand motion recognition according to an embodiment of the present invention.

The following detailed description of the invention refers to the accompanying drawings, which illustrate, by way of example, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. It should be understood that the various embodiments of the present invention are different from one another but need not be mutually exclusive. For example, the specific shapes, structures, and characteristics described herein may be changed from one embodiment to another without departing from the spirit and scope of the invention. It should also be understood that the location or arrangement of individual components within each embodiment may be varied without departing from the spirit and scope of the present invention. Therefore, the following detailed description is not to be taken in a limiting sense, and the scope of the present invention should be construed as encompassing the scope of the appended claims and all equivalents thereof. In the drawings, like reference numbers designate the same or similar components throughout the several views.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that a person skilled in the art to which the present invention pertains can easily practice the invention.

FIG. 1 is a schematic view of the apparatuses used in a drone control system using hand motion recognition according to an embodiment of the present invention.

Referring to FIG. 1, a drone control system according to an embodiment of the present invention includes a computing device 1, a flight device 2, and an electromagnet 3. According to an embodiment of the present invention, the configuration in which the electromagnet 3 is mounted on the flight device 2 constitutes a drone 4. That is, the drone 4 according to one embodiment of the present invention can move in three dimensions using the flight device 2, and can attach and detach an object using the electromagnet 3 mounted on the flight device 2.

In more detail, according to an embodiment of the present invention, the computing device 1 recognizes and determines the movement of a user's hand through image processing, and generates the control signals of the drone 4. The control signals generated by the computing device 1 include a first control signal for controlling the flight device 2 of the drone 4 and a second control signal for controlling the electromagnet 3 of the drone 4. The first control signal corresponds to the movement of the hand, and the second control signal corresponds to the gesture of the hand. The flight device 2 of the drone 4, upon receiving the first control signal, moves to the target position corresponding to the movement of the hand, and the electromagnet 3, upon receiving the second control signal, is turned ON or OFF.

In addition, the flight device 2 according to an embodiment of the present invention is an apparatus that allows the drone 4 to move in three-dimensional space, and can control its attitude and propellers according to the first control signal to move to the target position. Any known drone flight device can be used as the flight device 2 of the drone 4 without limitation.

In addition, the electromagnet 3 according to an embodiment of the present invention is a magnet that is turned on or off by an electrical signal, so that an object that contains iron or is otherwise magnetic can be attached to and detached from the drone 4. According to the embodiment of the present invention, since the drone 4 moves with the object attached to the electromagnet 3, the object can be moved by the drone 4.

Hereinafter, the operation of the drone control system using hand motion recognition, including the computing device 1, the flight device 2 of the drone 4, and the electromagnet 3 of the drone 4, will be described with reference to its internal configuration.

FIG. 2 is a diagram illustrating the internal configuration of a drone control system using hand motion recognition according to an embodiment of the present invention.

Referring to FIG. 2, the drone control system of the present invention includes a control server 100, a flight unit 200, a driving unit 300, a first communication network 400a for communication between the control server 100 and the flight unit 200, and a second communication network 400b for communication between the control server 100 and the driving unit 300.

First, the control server 100 of the drone control system may be a server implemented in the computing device 1 shown in FIG. 1, or a server implemented across one or more computing devices 1. More specifically, the control server 100 includes a Kinect sensor 110, a skeleton recognition unit 120, a hand motion determination unit 130, a control signal generation unit 140, a first transceiver 150a, a second transceiver 150b, and a control unit 160.

The Kinect sensor 110 of the control server 100 acquires stereoscopic image data by capturing the motion of the user. The Kinect sensor 110 can generate three-dimensional image data from which information on the user's body motion can be obtained without a separate controller. According to an embodiment of the present invention, the Kinect sensor 110 is used to analyze the movement of the user's hand through image processing without a separate control device such as an RC controller, and the control data of the drone 4 is generated therefrom; therefore, the degree of freedom of the control method is increased and the number of devices can be reduced.

Although the Kinect sensor 110 is shown as being included in the control server 100 in the embodiment of FIG. 2, the Kinect sensor 110 may be external to the control server 100 as a separate external device. When the Kinect sensor 110 exists separately from the control server 100 or the computing device, the control server 100 can receive the images captured by the Kinect sensor 110 via a wired or wireless communication network.

In addition, in one embodiment of the present invention, a known commercially available Kinect sensor may be used as the Kinect sensor 110. In this case, the Kinect sensor 110 may be one that provides a stereoscopic image from which the movement of the user's body, in particular the movement of the hand, can be discriminated. In an embodiment of the present invention, a known Kinect sensor can be used after being configured to communicate with the control server 100. Alternatively, the Kinect sensor 110 of the present invention may be installed in the computing device 1.

Next, the skeleton recognition unit 120 recognizes the skeleton from the stereoscopic image data acquired by the Kinect sensor 110. In particular, according to an embodiment of the present invention, the skeleton recognition unit 120 may first roughly recognize the entire skeleton of the body from the stereoscopic image data, match it against a general body skeleton to locate the skeleton of the hand, and then recognize the stereoscopic image data of the skeleton of the hand.

Next, the hand motion determination unit 130 determines the motion of the hand with reference to the result of recognizing the skeleton from the stereoscopic image data. More specifically, the skeleton recognition unit 120 recognizes and detects skeletons in the stereoscopic image acquired by the Kinect sensor 110 through image processing, and the detected skeletons serve as the basis for determining the movement of the hand. Here, the motion of the hand may include both the movement of the hand in three-dimensional space and the gesture of the hand.

For example, when the skeleton of the detected hand moves upward in space over time, the hand motion determination unit 130 may determine that the user's hand has moved from below to above. In addition, the hand motion determination unit 130 may acquire parameters related to the movement, such as the moving distance and speed of the hand, as well as its direction.
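
The determination described above can be sketched as follows. This is a minimal illustration only, assuming hand-joint positions are reported as (x, y, z) tuples in metres with timestamps in seconds; the patent does not specify a data format.

```python
# Hypothetical sketch: deriving the direction, distance, and speed of a hand
# movement from two successive hand-joint positions reported by a skeleton
# tracker. The (x, y, z)/timestamp format is an illustrative assumption.

def hand_motion(p0, t0, p1, t1):
    """Return (direction, distance, speed) between two hand samples."""
    dx = p1[0] - p0[0]
    dy = p1[1] - p0[1]
    dz = p1[2] - p0[2]
    distance = (dx * dx + dy * dy + dz * dz) ** 0.5
    dt = t1 - t0
    speed = distance / dt if dt > 0 else 0.0
    if distance == 0:
        return (0.0, 0.0, 0.0), 0.0, speed
    # Unit vector gives the direction of motion; distance and speed are
    # the movement parameters mentioned above.
    return (dx / distance, dy / distance, dz / distance), distance, speed
```

For instance, a hand sample moving 0.3 m along the x-axis over 0.5 s yields a direction of (1, 0, 0), a distance of 0.3 m, and a speed of 0.6 m/s.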

In addition, the hand motion determination unit 130 can grasp not only the movement of the hand but also the gesture taken by the user's hand. A gesture, as used in the present invention, is a specific hand shape or change of hand shape that conveys certain information. For example, a fist-clenching gesture may be a physical expression conveying that the user wants to pick up an object, and a hand-opening gesture may be a physical expression conveying that the user wants to drop the object.

Next, the control signal generator 140 determines whether the determined hand motion contains a valid motion, and generates the control signals of the drone 4 from the motion of the hand. Here, a valid hand motion is one whose magnitude exceeds a predetermined value, from which it can be judged that the user moved his or her hand for the purpose of controlling the drone 4. The control signals of the drone 4 generated by the control signal generator 140 include a first control signal transmitted to the flight device 2 and a second control signal transmitted to the electromagnet 3.
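
The validity test described above can be sketched as a simple threshold check; a movement only produces a control signal when its magnitude exceeds a preset value, so small tremors do not move the drone. The 5 cm threshold below is an illustrative assumption, not a value taken from the patent.

```python
# Hypothetical sketch of the "valid movement" test: a hand displacement is
# treated as intentional only when it exceeds a predetermined magnitude.

VALID_MOVE_THRESHOLD_M = 0.05  # assumed minimum displacement, in metres

def is_valid_movement(displacement):
    """True when the hand displacement (dx, dy, dz) is large enough to be intentional."""
    magnitude = sum(c * c for c in displacement) ** 0.5
    return magnitude >= VALID_MOVE_THRESHOLD_M
```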

More specifically, the control signal generator 140 detects whether the determined hand motion contains a valid hand movement, and generates the first control signal for controlling the flight device 2 of the drone 4 when a valid hand movement is present. Here, the control signal generator 140 may perform hand tracking to detect the movement of the hand.

In addition, the control signal generator 140 detects whether the determined hand motion contains a valid hand gesture, and generates the second control signal for controlling the electromagnet 3 of the drone 4 when a valid hand gesture is present. Here, the control signal generator 140 may perform finger tracking to detect the hand gesture.
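
One possible gesture classifier for the fist/open-hand distinction above can be sketched from finger-tracking output. The proxy used here, the mean fingertip-to-palm distance, and the 6 cm threshold are illustrative assumptions; the patent does not specify a classification method.

```python
# Hypothetical gesture classifier: a closed fist (mount the object) has the
# fingertips close to the palm centre, an open hand (release it) has them
# spread out. Both the metric and the threshold are assumptions.

FIST_RADIUS_M = 0.06  # assumed threshold on mean fingertip spread, metres

def classify_gesture(palm, fingertips):
    """Return 'fist' or 'open' from a palm point and a list of fingertip points."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    mean_spread = sum(dist(palm, f) for f in fingertips) / len(fingertips)
    return "fist" if mean_spread < FIST_RADIUS_M else "open"
```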

The first transceiver 150a transmits the generated first control signal to the flight unit 200 via the first communication network 400a, and the second transceiver 150b transmits the generated second control signal to the driving unit 300 via the second communication network 400b. According to an embodiment of the present invention, the first communication network 400a may be a WiFi communication network and the second communication network 400b may be a Bluetooth communication network. As described above, since the present invention uses existing communication networks and transceivers, the range of application of drone control can be broadened.
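
The two-network routing can be sketched as a small dispatcher. The transport functions are injected callables, so any real WiFi or Bluetooth sender could be plugged in; the signal-type names and message shapes here are illustrative assumptions.

```python
# Hypothetical dispatch sketch for the two transceivers: first control
# signals (flight) go over the WiFi link 400a, second control signals
# (gripper) over the Bluetooth link 400b. Only the routing logic is shown.

def make_dispatcher(wifi_send, bluetooth_send):
    """Return a function that routes control signals to the proper network."""
    def dispatch(signal_type, payload):
        if signal_type == "first":      # flight control -> WiFi (400a)
            wifi_send(payload)
        elif signal_type == "second":   # electromagnet control -> Bluetooth (400b)
            bluetooth_send(payload)
        else:
            raise ValueError("unknown control signal type: " + signal_type)
    return dispatch
```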

In addition, the control unit 160 controls each of the units in the control server 100 to perform their respective roles.

Next, the flight unit 200 includes a transceiver 210 and the flight device 2. In an embodiment of the present invention, the flight unit 200 may exist in a form in which the transceiver 210 is coupled to the flight device 2, or in which the transceiver 210 is embedded in it. The transceiver 210 of the flight unit 200 receives the first control signal transmitted through the first communication network 400a. As described above, the first control signal is a signal for controlling the flight device 2. Since the drone 4 can move in three dimensions by means of the flight device 2, when the flight device 2 is controlled according to the first control signal, the drone 4 moves accordingly.

FIGS. 3A through 3C are diagrams illustrating a drone 4 moving in response to a first control signal according to an embodiment of the present invention.

First, FIG. 3A illustrates the drone 4 moving in the x-axis direction when the user's hand moves in the x-axis direction, according to an embodiment of the present invention. That is, the control server 100 processes the stereoscopic image data of the hand obtained from the Kinect sensor 110 to determine the movement of the hand, and when there is a movement of the hand in the x-axis direction, generates a first control signal for controlling the flight device 2 to move in the same direction. The flight device 2 of the flight unit 200, upon receiving the generated first control signal through the first communication network 400a, moves in the x-axis direction.
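
The mapping from hand movement to first control signal can be sketched per axis. The gain and the velocity-command message format are illustrative assumptions; the patent only states that the drone moves in the same direction as the hand.

```python
# Hypothetical mapping from a valid hand displacement to a first control
# signal: the displacement along each axis is scaled into a velocity
# command for the flight device. GAIN and the dict format are assumptions.

GAIN = 2.0  # assumed scale factor: metres of hand motion -> m/s of drone velocity

def first_control_signal(displacement):
    """Map a hand displacement (dx, dy, dz) to a per-axis velocity command."""
    dx, dy, dz = displacement
    return {"vx": GAIN * dx, "vy": GAIN * dy, "vz": GAIN * dz}
```

An x-axis hand movement thus produces a command with only the vx component nonzero, matching the behaviour of FIG. 3A.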

FIG. 3B illustrates, on the same principle as FIG. 3A, the drone 4 moving in the y-axis direction when the user's hand moves in the y-axis direction according to an embodiment of the present invention. FIG. 3C illustrates, on the same principle as FIG. 3A, the drone 4 moving in the z-axis direction when the user's hand moves in the z-axis direction according to an embodiment of the present invention.

As shown in FIGS. 3A to 3C, the flight device 2 of the drone 4 is controlled so as to correspond to the movement of the user's hand in three-dimensional space, so that the degree of freedom of the control method is increased and the number of devices can be reduced.

Next, the driving unit 300 includes a transceiver 310, a controller 320, and the electromagnet 3. The driving unit 300 according to an embodiment of the present invention may include the electromagnet 3 as a driving device controlled to mount or separate an object based on the second control signal received via the second communication network.

In one embodiment of the present invention, the driving unit 300 may exist in a form in which the transceiver 310 and the controller 320 are mounted on the electromagnet 3. The transceiver 310 of the driving unit 300 receives the second control signal transmitted through the second communication network 400b. As described above, the second control signal is a signal for controlling the electromagnet 3. Since the electromagnet 3 is turned on or off in accordance with the second control signal, an object that contains iron or has magnetism can be attached to and detached from the electromagnet 3, and thus attached to and detached from the drone 4.

More specifically, the controller 320 of the driving unit 300 controls the electromagnet 3 to be turned on or off according to the second control signal. That is, when the second control signal is a signal for turning the electromagnet 3 ON, the controller 320 generates an electrical signal that turns the electromagnet 3 ON and supplies it to the electromagnet 3. The electromagnet 3 is turned ON by the electrical signal of the controller 320 so that the object is mounted, and turned OFF so that the object is separated from the drone 4. In an embodiment of the present invention, the controller 320 may be an Arduino board electrically connected to the electromagnet 3.
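
The controller's conversion step can be sketched as follows. A real implementation would drive an output pin on the Arduino-style board; here the pin is modelled as a dict entry, and the signal names "mount"/"separate" are illustrative assumptions.

```python
# Hypothetical sketch of the controller 320 logic: the second control signal
# is converted into an on/off electrical state for the electromagnet.

def apply_second_control(signal, pin_state):
    """Convert a second control signal into the electromagnet's on/off state."""
    if signal == "mount":
        pin_state["electromagnet"] = True    # energise: object attaches
    elif signal == "separate":
        pin_state["electromagnet"] = False   # de-energise: object drops
    else:
        raise ValueError("unknown second control signal: " + signal)
    return pin_state
```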

That is, according to the embodiment of the present invention, hand motion recognition can trigger not only the movement of the drone 4 but also specific actions of the drone 4, such as loading or unloading an object, so that the drone 4 can perform sophisticated operations that are impossible with a conventional RC controller.

FIGS. 4A and 4B are diagrams illustrating a drone 4 that operates in response to a second control signal in an embodiment of the present invention.

FIG. 4A illustrates that, when the user makes a fist-clenching gesture according to an embodiment of the present invention, the electromagnet 3 of the drone 4 is turned ON so that an object can be mounted on the drone 4. That is, the control server 100 processes the stereoscopic image data of the hand obtained from the Kinect sensor 110 to determine the motion of the hand, and when the motion of the hand includes a fist-clenching gesture, generates a second control signal for turning the electromagnet 3 ON. The controller 320, upon receiving the generated second control signal through the second communication network 400b, generates an electrical signal that turns the electromagnet 3 ON.

FIG. 4B is a diagram illustrating that, when the user takes a hand gesture according to an embodiment of the present invention, the electromagnet 3 of the drone 4 is turned OFF and the object can be separated from the drone 4.

According to another embodiment of the present invention, the driving unit 300 may include a driving device other than the electromagnet 3. That is, according to the present invention, the electromagnet 3 of the driving unit 300 can be replaced with any driving apparatus that can mount or separate an object. Although the electromagnet 3 has been illustrated as the main embodiment of the present invention for attaching an object to and detaching it from the drone 4, the object of the present invention is to move an object using the drone 4, so an object mounting apparatus such as a clamp or gripper can be used instead of the electromagnet 3. In this case, the second control signal is a control signal for the gripper to load or unload the object.

FIG. 5 is a flowchart illustrating a drone control method using hand motion recognition according to an embodiment of the present invention in time sequence.

First, referring to FIG. 5, the Kinect sensor 110 of the computing device 1 acquires stereoscopic image data by stereoscopically capturing an operation of the user (S1).

Next, the skeleton recognition unit 120 of the computing device 1 recognizes the skeleton from the stereoscopic image data (S2).

Next, the hand motion determiner 130 of the computing device 1 determines movement of the hand from the stereoscopic image data (S3).

Next, the control signal generator 140 generates the first control signal and the second control signal from the movement of the hand. More specifically, it is detected whether there is a valid hand movement in the determined movement of the hand (S4), and a first control signal is generated when a valid hand movement exists (S5). If there is no valid hand movement, the process moves to step S7 described later.

Next, the first transmission/reception unit 150a transmits the generated first control signal to the flight unit 200 through the first communication network 400a (S6). The flight unit 200 controls the flight device 2 of the drone 4 based on the first control signal.

In addition, the control signal generator 140 detects whether there is a valid hand gesture in the determined hand movement (S7), and generates a second control signal when a valid hand gesture exists (S8). If there is no valid hand gesture, the process returns to step S1 to continuously obtain the stereoscopic image data.

Next, the second transmission/reception unit 150b transmits the generated second control signal to the driving unit 300 through the second communication network 400b (S9). The driving unit 300 controls the electromagnet 3 of the drone 4 based on the second control signal.
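The branching among steps S4 to S8 described above can be summarized as a decision function evaluated per processed frame. This is a sketch under assumed encodings (a movement vector, and "fist"/"open" gesture labels), not the patented implementation:

```python
def control_signals_for_frame(hand_movement, hand_gesture):
    """Decide which control signals one processed frame produces.

    hand_movement is the result of the validity check of step S4
    (None if no valid movement exists); hand_gesture is the result of
    step S7 (None, "fist", or "open" in this sketch). Returns the pair
    (first_control_signal, second_control_signal).
    """
    first = None
    second = None
    if hand_movement is not None:        # S4: valid hand movement exists
        first = ("MOVE", hand_movement)  # S5: generate first control signal
    if hand_gesture == "fist":           # S7: valid hand gesture exists
        second = "ON"                    # S8: electromagnet ON signal
    elif hand_gesture == "open":
        second = "OFF"                   # S8: electromagnet OFF signal
    return first, second
```

The transmission steps then forward `first` over the first communication network and `second` over the second communication network whenever they are not None.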

That is, according to the present invention, the user can move the drone 4 to a target position through the movement of the hand, mount an object at the target position on the drone 4, move the drone 4 to another position, and then separate the object from the drone 4, so that the object can be moved to another place solely by the movement of the hand. Therefore, since the present invention provides a drone control method based on recognizing the motion of the user's hand, the convenience of controlling the drone 4 can be improved.
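The pick-and-place workflow described in the preceding paragraph could be driven by the following command sequence. The `send_first`/`send_second` callbacks and the command tuples are hypothetical names introduced for this sketch to stand for transmitting the first and second control signals:

```python
def pick_and_place(send_first, send_second, pickup, dropoff):
    """Issue the assumed command sequence for moving an object.

    send_first / send_second transmit first and second control signals
    (hypothetical interfaces); pickup and dropoff are target positions.
    """
    send_first(("GOTO", pickup))   # hand movement steers the drone to the object
    send_second("ON")              # fist gesture: electromagnet ON, object attached
    send_first(("GOTO", dropoff))  # hand movement moves the drone elsewhere
    send_second("OFF")             # open-hand gesture: electromagnet OFF, object released
```

In the patented system these four commands would not be issued programmatically but would each result from the user's hand movements and gestures in turn.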

The specific embodiments described in the present invention are examples and are not intended to limit the scope of the invention in any way. For brevity of description, descriptions of conventional electronic configurations, control systems, software, and other functional aspects of such systems may be omitted. Also, the connections or connecting members of the lines between the components shown in the figures illustrate functional connections and/or physical or circuit connections, which may be replaced or supplemented in an actual apparatus by a variety of functional, physical, or circuit connections. Also, unless a component is explicitly described with terms such as "essential" or "important", it may not be a necessary component for the application of the present invention.

The use of the term "the" and similar referring expressions in the specification of the present invention (particularly in the claims) may refer to both the singular and the plural. In addition, when a range is described in the present invention, the invention to which the individual values belonging to the range are applied is included (unless there is a contradiction), as if each individual value constituting the range were described in the detailed description of the invention. Finally, the steps constituting the method according to the invention may be performed in any suitable order unless an order is explicitly stated or the contrary is described; the present invention is not necessarily limited to the described order of the steps. The use of all examples or exemplary language (e.g., "such as") in the present invention is merely for describing the present invention in detail, and the scope of the present invention is not limited by these examples or this exemplary language unless limited by the claims. It will also be appreciated by those skilled in the art that various modifications, combinations, and alterations may be made depending on design criteria and factors within the scope of the appended claims or their equivalents.

The embodiments of the present invention described above can be implemented in the form of program instructions that can be executed through various computer components and recorded in a computer-readable recording medium. The computer-readable recording medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the computer-readable recording medium may be those specifically designed and configured for the present invention or may be those known and available to those skilled in the computer software arts. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROM and DVD; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code, such as that generated by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware device may be configured to operate as one or more software modules for performing the processing according to the present invention, and vice versa.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed embodiments; those skilled in the art will appreciate that various modifications and changes may be made thereto without departing from the scope of the present invention.

Accordingly, the spirit of the present invention should not be construed as being limited to the above-described embodiments, and not only the appended claims but also all ranges equivalent to or equivalently modified from the claims fall within the scope of the spirit of the present invention.

1: computing device 2: flight device
3: electromagnet 4: drone
100: control server 200: flight unit
300: driving unit 400a, 400b: first and second communication networks

Claims (20)

A computing device for generating a first control signal and a second control signal based on image processing of stereoscopic image data on movement of a hand obtained from a Kinect sensor;
A flight unit including a flight device of a drone, the movement of which is controlled based on the first control signal received via a first communication network; and
A driving unit including a driving device mounted on the flight device of the drone, the driving device being controlled to be able to mount or separate an object based on the second control signal received via a second communication network;
A drone control system using hand motion recognition.
The method according to claim 1,
The computing device comprising:
A Kinect sensor for stereoscopically capturing an operation of a user to acquire stereoscopic image data;
A skeleton recognition unit for recognizing a skeleton from the image data acquired by the Kinect sensor;
A hand motion determination unit for determining a motion of the hand based on the recognized skeleton;
A control signal generator for generating the first control signal or the second control signal from the motion of the hand; in the drone control system using hand motion recognition.
3. The method of claim 2,
Wherein the control signal generator detects whether there is a valid hand movement in the determined movement of the hand, and generates the first control signal capable of controlling the flight device of the drone when a valid hand movement exists, in the drone control system using hand motion recognition.
3. The method of claim 2,
Wherein the control signal generator detects whether there is a valid hand gesture in the determined hand movement, and generates the second control signal capable of controlling the driving device of the drone when a valid hand gesture exists, in the drone control system using hand motion recognition.
The method according to claim 1,
Wherein the driving device is an electromagnet which is turned on / off according to an electrical signal.
The method according to claim 1,
Wherein the first communication network is a WiFi communication network and the second communication network is a Bluetooth communication network.
The method according to claim 1,
Wherein the driving unit further comprises a controller for converting the second control signal into an electrical signal for controlling the driving device.
The method according to claim 1,
Wherein the image processing is skeleton recognition, hand tracking, or finger tracking to detect movement of a user's hand.
The method according to claim 1,
Wherein the first control signal is a signal for controlling the flight device of the drone so that the drone can move in response to the movement of the hand.
The method according to claim 1,
Wherein the second control signal is a signal for controlling the driving device to mount an object when the gesture of the hand determined from the motion of the hand is a gesture of clenching a fist, and to separate the object when the gesture of the hand is a gesture of opening the hand, in the drone control system using hand motion recognition.
Generating a first control signal and a second control signal based on the image processing of the stereoscopic image data on the motion of the hand obtained from the Kinect sensor;
Controlling movement of the flight device of the drone based on the first control signal received via a first communication network; and
Controlling the driving device mounted on the flight device of the drone so as to be able to mount or separate an object based on the second control signal received through a second communication network;
A drone control method using hand motion recognition.
12. The method of claim 11,
Wherein the generating the first control signal and the second control signal comprises:
Acquiring stereoscopic image data by stereoscopically capturing an operation of a user using a Kinect sensor;
Recognizing a skeleton from the image data acquired by the Kinect sensor;
Determining movement of the hand based on the recognized skeleton;
And a control signal generating step of generating the first control signal or the second control signal from the motion of the hand.
13. The method of claim 12,
Wherein the control signal generation step detects whether there is a valid hand movement in the determined hand movement, and generates the first control signal capable of controlling the flight device of the drone when a valid hand movement exists, in the drone control method using hand motion recognition.
13. The method of claim 12,
Wherein the control signal generation step detects whether there is a valid hand gesture in the determined hand movement, and generates the second control signal capable of controlling the driving device of the drone when a valid hand gesture exists, in the drone control method using hand motion recognition.
12. The method of claim 11,
Wherein the driving device is an electromagnet which is turned on / off according to an electrical signal.
12. The method of claim 11,
Wherein the first communication network is a WiFi communication network and the second communication network is a Bluetooth communication network.
12. The method of claim 11,
Wherein the step of controlling the driving device comprises the step of converting the second control signal into an electrical signal for controlling the driving device.
12. The method of claim 11,
Wherein the image processing performs skeletal recognition, hand tracking, or finger tracking to detect movement of a user's hand.
12. The method of claim 11,
Wherein the first control signal is a signal for controlling the flight device of the drone so that the drone can move in correspondence with the movement of the hand.
12. The method of claim 11,
Wherein the second control signal is a signal for controlling the driving device to mount an object when the gesture of the hand determined from the motion of the hand is a gesture of clenching a fist, and to separate the object when the gesture of the hand is a gesture of opening the hand.
KR1020160011144A 2016-01-29 2016-01-29 Method and system for controlling drone using hand motion tracking KR20170090603A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160011144A KR20170090603A (en) 2016-01-29 2016-01-29 Method and system for controlling drone using hand motion tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160011144A KR20170090603A (en) 2016-01-29 2016-01-29 Method and system for controlling drone using hand motion tracking

Publications (1)

Publication Number Publication Date
KR20170090603A true KR20170090603A (en) 2017-08-08

Family

ID=59653005

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160011144A KR20170090603A (en) 2016-01-29 2016-01-29 Method and system for controlling drone using hand motion tracking

Country Status (1)

Country Link
KR (1) KR20170090603A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108171121A (en) * 2017-12-11 2018-06-15 翔升(上海)电子技术有限公司 UAV Intelligent tracking and system
KR101896239B1 (en) * 2017-09-20 2018-09-07 (주)텔트론 System for controlling drone using motion capture
EP3499332A2 (en) 2017-12-14 2019-06-19 Industry Academy Cooperation Foundation Of Sejong University Remote control device and method for uav and motion control device attached to uav
KR20190076407A (en) 2017-12-22 2019-07-02 세종대학교산학협력단 Remote control device and method of uav
WO2019144300A1 (en) * 2018-01-23 2019-08-01 深圳市大疆创新科技有限公司 Target detection method and apparatus, and movable platform
KR102032067B1 (en) 2018-12-05 2019-10-14 세종대학교산학협력단 Remote control device and method of uav based on reforcement learning
JP2023081259A (en) * 2021-11-30 2023-06-09 仁寶電腦工業股▲ふん▼有限公司 Control device for unmanned aerial vehicle and control method thereof

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101896239B1 (en) * 2017-09-20 2018-09-07 (주)텔트론 System for controlling drone using motion capture
CN108171121A (en) * 2017-12-11 2018-06-15 翔升(上海)电子技术有限公司 UAV Intelligent tracking and system
EP3499332A2 (en) 2017-12-14 2019-06-19 Industry Academy Cooperation Foundation Of Sejong University Remote control device and method for uav and motion control device attached to uav
KR20190076407A (en) 2017-12-22 2019-07-02 세종대학교산학협력단 Remote control device and method of uav
WO2019144300A1 (en) * 2018-01-23 2019-08-01 深圳市大疆创新科技有限公司 Target detection method and apparatus, and movable platform
KR102032067B1 (en) 2018-12-05 2019-10-14 세종대학교산학협력단 Remote control device and method of uav based on reforcement learning
US11567491B2 (en) 2018-12-05 2023-01-31 Industry Academy Cooperation Foundation Of Sejong University Reinforcement learning-based remote control device and method for an unmanned aerial vehicle
JP2023081259A (en) * 2021-11-30 2023-06-09 仁寶電腦工業股▲ふん▼有限公司 Control device for unmanned aerial vehicle and control method thereof
US11921523B2 (en) 2021-11-30 2024-03-05 Compal Electronics, Inc. Control device for unmanned aerial vehicle and control method therefor

Similar Documents

Publication Publication Date Title
KR20170090603A (en) Method and system for controlling drone using hand motion tracking
KR102529137B1 (en) Augmented reality display device with deep learning sensors
CN112513711B (en) Method and system for resolving hemispherical ambiguities using position vectors
US20160309124A1 (en) Control system, a method for controlling an uav, and a uav-kit
AU2021277680A1 (en) Electromagnetic tracking with augmented reality systems
CN109313417A (en) Help robot localization
US20200171667A1 (en) Vision-based robot control system
KR20190072944A (en) Unmanned aerial vehicle and operating method thereof, and automated guided vehicle for controlling movement of the unmanned aerial vehicle
JP2017174110A (en) Unmanned mobile device, takeover method, and program
CN107787497A (en) Method and apparatus for the detection gesture in the space coordinates based on user
WO2018078863A1 (en) Drone control system, method, and program
US20180315211A1 (en) Tracking system and method thereof
KR102460739B1 (en) Industrial Electronic Badges
CN105773619A (en) Electronic control system used for realizing grabbing behavior of humanoid robot and humanoid robot
Lippiello et al. 3D monocular robotic ball catching with an iterative trajectory estimation refinement
JP7230941B2 (en) Information processing device, control method, and program
Pradeep et al. Follow me robot using bluetooth-based position estimation
Luo et al. A scalable modular architecture of 3D object acquisition for manufacturing automation
Luo et al. Stereo Vision-based Autonomous Target Detection and Tracking on an Omnidirectional Mobile Robot.
CN109596126A (en) A kind of determination method and apparatus of robot space coordinates transformational relation
CN115657718A (en) Aircraft dynamic target tracking navigation method and device and readable medium
CN206291910U (en) The acquisition system of the attitude information of carrier
JP2009145296A (en) Object direction detecting system and data structure
Zhang et al. Vision based surface slope estimation for unmanned aerial vehicle perching
Jeong et al. Relative Pose estimation for an integrated UGV-UAV robot system

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal