KR20170090603A - Method and system for controlling drone using hand motion tracking - Google Patents
Method and system for controlling drone using hand motion tracking
- Publication number
- KR20170090603A KR1020160011144A
- Authority
- KR
- South Korea
- Prior art keywords
- hand
- control signal
- movement
- controlling
- communication network
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0016—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C27/00—Rotorcraft; Rotors peculiar thereto
- B64C27/04—Helicopters
- B64C27/08—Helicopters with two or more rotors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0022—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the communication link
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G06K9/00355—
-
- G06K9/00389—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W88/00—Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
- H04W88/02—Terminal devices
- H04W88/06—Terminal devices adapted for operation in multiple networks or having at least two operational modes, e.g. multi-mode terminals
-
- B64C2201/024—
-
- B64C2201/146—
Abstract
Description
[0001] The present invention relates to a drone control system and method using hand motion recognition, and more particularly to a system and method capable of controlling a drone in response to the movement of a hand, so that an object can be moved by controlling the drone through the movement of the hand alone.
A drone is a small unmanned aerial vehicle resembling an airplane or a helicopter. Unmanned aerial vehicles were first used for military purposes but are now routinely used in a variety of fields such as agriculture, aerial filming, and shipping, and are emerging as a new industry with an average annual growth rate of 8%. The drone industry is expected to expand greatly in the future, and its possibilities are wide-ranging. However, because existing drones are operated through an RC controller, user convenience is limited and precise operation is difficult.
Meanwhile, the Kinect sensor is a camera sensor specialized in extracting and tracking human joints; it can detect a user's position and motion using 3D stereoscopic image data.
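By way of illustration only, since the patent does not tie itself to a particular SDK, this kind of skeleton data can be treated as a per-frame mapping from joint names to 3D positions. The sketch below assumes a hypothetical `get_skeleton()` binding (not named in the patent) and simply samples the right-hand joint over time:

```python
import time

def get_skeleton():
    """Hypothetical Kinect binding: return a dict mapping joint names
    (e.g. "hand_right", "palm_right", "index_tip") to (x, y, z)
    positions in metres, or None if no user is visible.
    A real system would call an actual Kinect SDK here."""
    raise NotImplementedError("replace with a real Kinect SDK call")

def sample_hand_positions(duration_s=5.0, rate_hz=30.0):
    """Collect the right-hand joint position over a short time window."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        skeleton = get_skeleton()
        if skeleton and "hand_right" in skeleton:
            samples.append(skeleton["hand_right"])
        time.sleep(1.0 / rate_hz)
    return samples
```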
An object of the present invention is to control a drone using only the movement of a hand.
Another object of the present invention is to move an object by controlling a drone using only the movement of a hand.
It is yet another object of the present invention to provide a method that uses an existing communication network and transceiver for drone control.
According to an embodiment of the present invention, there is provided a drone control system using hand motion recognition, including: a computing device for generating a first control signal and a second control signal based on image processing of stereoscopic image data on the motion of a hand obtained from a Kinect sensor; a flight unit including a flight device of the drone, the movement of which is controlled based on the first control signal received via a first communication network; and a driving unit including a driving device mounted on the flight device of the drone, the driving device being controlled to mount or separate an object based on the second control signal received via a second communication network.
In the present invention, the computing device may include: a Kinect sensor for three-dimensionally capturing a user's motion to acquire stereoscopic image data; a skeleton recognition unit for recognizing a skeleton from the image data acquired by the Kinect sensor; a hand motion determination unit for determining the motion of the hand based on the recognized skeleton; and a control signal generator for generating the first control signal or the second control signal from the motion of the hand.
In the present invention, the control signal generator may detect whether the determined hand movement contains a valid hand movement, and generate the first control signal, capable of controlling the flight device of the drone, when a valid hand movement is present.
In the present invention, the control signal generator may detect whether the determined hand movement contains a valid hand gesture, and generate the second control signal, capable of controlling the driving device of the drone, when a valid hand gesture is present.
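A minimal sketch of such a control signal generator is given below. It assumes the hand position and the number of extended fingers have already been extracted from the skeleton (one way to count fingers is sketched further down), and the validity threshold is an illustrative value, not one stated in the patent:

```python
def generate_control_signals(prev_hand, curr_hand, extended_fingers,
                             move_threshold=0.03):
    """Return (first_control_signal, second_control_signal).

    The first control signal is the hand displacement (dx, dy, dz) in
    metres, emitted only when the movement is large enough to count as
    a "valid" hand movement rather than sensor noise.  The second
    control signal is "mount" for a clenched fist, "separate" for an
    open hand, and None otherwise.
    """
    dx = curr_hand[0] - prev_hand[0]
    dy = curr_hand[1] - prev_hand[1]
    dz = curr_hand[2] - prev_hand[2]
    first = (dx, dy, dz) if max(abs(dx), abs(dy), abs(dz)) > move_threshold else None

    if extended_fingers == 0:        # fist -> mount the object
        second = "mount"
    elif extended_fingers >= 4:      # open hand -> separate the object
        second = "separate"
    else:
        second = None
    return first, second
```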
In the present invention, the driving device may be an electromagnet which is turned on / off according to an electrical signal.
In the present invention, the first communication network may be a WiFi communication network, and the second communication network may be a Bluetooth communication network.
In the present invention, the driving unit may further include a controller for converting the second control signal into an electrical signal for controlling the driving device.
In the present invention, the image processing may perform skeleton recognition, hand tracking, or finger tracking to detect movement of a user's hand.
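One plausible way to realize the finger-tracking part of this image processing is to count how many fingertip joints lie far from the palm centre. The joint names and the distance threshold below are assumptions made for illustration, since the patent describes finger tracking only at this level of detail:

```python
import math

FINGERTIP_JOINTS = ["thumb_tip", "index_tip", "middle_tip", "ring_tip", "pinky_tip"]

def count_extended_fingers(skeleton, open_threshold=0.07):
    """Count fingertips lying more than open_threshold metres from the
    palm centre; 0 suggests a clenched fist, 4-5 an open hand."""
    palm = skeleton["palm_right"]
    extended = 0
    for name in FINGERTIP_JOINTS:
        tip = skeleton.get(name)
        if tip is not None and math.dist(tip, palm) > open_threshold:
            extended += 1
    return extended
```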
In the present invention, the first control signal may be a signal for controlling the flight device of the drone so that the drone moves in correspondence with the movement of the hand.
In the present invention, the second control signal may be a signal that controls the driving device to mount an object when the hand gesture determined from the movement of the hand is a fist-clenching gesture, and controls the driving device to separate the object when the hand gesture is a hand-stretching (open-hand) gesture.
According to another embodiment of the present invention, there is provided a drone control method using hand motion recognition, including the steps of: generating a first control signal and a second control signal based on image processing of stereoscopic image data on the movement of a hand obtained from a Kinect sensor; controlling movement of the flight device of the drone based on the first control signal received via a first communication network; and controlling a driving device mounted on the flight device of the drone to mount or separate an object based on the second control signal received through a second communication network.
In the present invention, the step of generating the first control signal and the second control signal may include: acquiring stereoscopic image data by stereoscopically capturing a user's motion using a Kinect sensor; recognizing a skeleton from the image data acquired by the Kinect sensor; determining the movement of the hand based on the recognized skeleton; and generating the first control signal or the second control signal from the motion of the hand.
In the present invention, the control signal generation step may include detecting whether the determined hand movement contains a valid hand movement, and generating the first control signal, capable of controlling the flight device of the drone, when a valid hand movement is present.
In the present invention, the control signal generation step may include detecting whether the determined hand movement contains a valid hand gesture, and generating the second control signal, capable of controlling the driving device of the drone, when a valid hand gesture is present.
In the present invention, the driving device may be an electromagnet which is turned on / off according to an electrical signal.
In the present invention, the first communication network may be a WiFi communication network, and the second communication network may be a Bluetooth communication network.
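The patent specifies only the network types, not a protocol. As one plausible realization, the first control signal could be sent as a UDP datagram over the WiFi link and the second control signal over a Bluetooth RFCOMM socket; the address, port, channel, and JSON message format below are illustrative assumptions (AF_BLUETOOTH is available in CPython on Linux):

```python
import json
import socket

DRONE_WIFI_ADDR = ("192.168.4.1", 9000)    # illustrative WiFi endpoint of the flight unit
DRONE_BT_ADDR = ("AA:BB:CC:DD:EE:FF", 1)   # illustrative Bluetooth MAC and RFCOMM channel

wifi_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
bt_sock = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM, socket.BTPROTO_RFCOMM)
bt_sock.connect(DRONE_BT_ADDR)

def send_first_control_signal(command):
    """Send a movement command (dict) to the flight unit over the WiFi link."""
    wifi_sock.sendto(json.dumps({"move": command}).encode(), DRONE_WIFI_ADDR)

def send_second_control_signal(gesture):
    """Send the mount/separate command to the driving unit over the Bluetooth link."""
    bt_sock.send(json.dumps({"magnet": gesture}).encode())
```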
In the present invention, controlling the driving device may include converting the second control signal into an electrical signal for controlling the driving device.
In the present invention, the image processing may perform skeleton recognition, hand tracking, or finger tracking to detect movement of a user's hand.
In the present invention, the first control signal may be a signal for controlling the flight device of the drone so that the drone moves in correspondence with the movement of the hand.
In the present invention, the second control signal may be a signal that controls the driving device to mount an object when the hand gesture determined from the movement of the hand is a fist-clenching gesture, and controls the driving device to separate the object when the hand gesture is a hand-stretching (open-hand) gesture.
According to the present invention, the flight device of the drone is controlled in response to the movement of the hand and the driving device of the drone is controlled according to the gesture of the hand, so that an object can be moved by controlling the drone through the movement of the hand alone, and user convenience can be increased.
According to the present invention, a drone control method based on recognition of the motion of the user's hand is provided in place of the conventional RC (radio control) controller, so that the convenience of drone control can be improved. That is, according to an embodiment of the present invention, the motion of the user's hand is analyzed through image processing and the control data for the drone is generated from it without a separate control device, so that greater freedom and convenience in controlling the drone can be achieved.
In addition, according to an embodiment of the present invention, hand motion recognition can be used to make the drone perform specific actions as well as to move it, so that the drone can carry out sophisticated operations that are impossible with a conventional RC controller.
In addition, according to an embodiment of the present invention, since an existing communication network and transceiver are used, the range of applications of drone control can be broadened.
FIG. 1 is a schematic view of the apparatuses used in a drone control system using hand motion recognition according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating the internal configuration of a drone control system using hand motion recognition according to an embodiment of the present invention.
FIGS. 3A through 3C are diagrams illustrating a drone moving in response to a first control signal according to an embodiment of the present invention.
FIGS. 4A and 4B are diagrams illustrating a drone operating in response to a second control signal according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating, in time sequence, a drone control method using hand motion recognition according to an embodiment of the present invention.
The following detailed description of the invention refers to the accompanying drawings, which show, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. It should be understood that the various embodiments of the present invention are different from one another but need not be mutually exclusive. For example, the specific shapes, structures, and characteristics described herein may be changed from one embodiment to another without departing from the spirit and scope of the invention. It should also be understood that the position or arrangement of individual components within each embodiment may be varied without departing from the spirit and scope of the present invention. Therefore, the following detailed description is not to be taken in a limiting sense, and the scope of the present invention should be construed as encompassing the scope of the appended claims and all equivalents thereof. In the drawings, like reference numerals designate the same or similar components throughout the several views.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that a person of ordinary skill in the art to which the present invention pertains can easily practice the invention.
FIG. 1 is a schematic view of the apparatuses used in a drone control system using hand motion recognition according to an embodiment of the present invention.
Referring to FIG. 1, a drone control system according to an embodiment of the present invention includes a computing device 1, a flight device 2, an electromagnet 3, and a drone 4.
In more detail, according to an embodiment of the present invention, the computing device 1 recognizes and determines the movement of the user's hand through image processing and generates the control signals of the drone 4. At this time, the control signals generated by the computing device 1 include a first control signal for controlling the flight device 2 and a second control signal for controlling the electromagnet 3.
In addition, the
In addition, the
Hereinafter, the operating method of the drone control system using hand motion recognition, including the computing device 1, the flight device 2, the electromagnet 3, and the drone 4, will be described in more detail.
FIG. 2 is a diagram illustrating the internal configuration of a drone control system using hand motion recognition according to an embodiment of the present invention.
Referring to FIG. 2, the dron control system of the present invention includes a
First, the
The
Although the
In addition, in one embodiment of the present invention, the
Next, the
Next, the
For example, when the skeleton of the detected hand moves upward from below in the space according to the change in time, the
In addition, the hand
Next, the
More specifically, the
In addition, the
The first transmission /
In addition, the
Next, the
FIGS. 3A through 3C are diagrams illustrating a drone 4 moving in response to a first control signal according to an embodiment of the present invention.
First, FIG. 3A illustrates moving the drone 4 in the x-axis direction when the user's hand is moved in the x-axis direction according to an embodiment of the present invention. That is, the
FIG. 3B is a diagram illustrating the movement of the drone 4 in the y-axis direction when the user's hand is moved in the y-axis direction according to an embodiment of the present invention, on the same principle as FIG. 3A. FIG. 3C is a diagram illustrating the movement of the drone 4 in the z-axis direction when the user's hand is moved in the z-axis direction according to an embodiment of the present invention, on the same principle as FIG. 3A.
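FIGS. 3A through 3C amount to a per-axis mapping from hand motion to drone motion. A minimal sketch of that mapping is shown below; the proportional gain and the command format are assumptions, not values given in the patent:

```python
def hand_to_drone_command(displacement, gain=2.0):
    """Map a hand displacement (dx, dy, dz) in metres to a drone
    velocity command along the same axes, as in FIGS. 3A-3C: hand
    motion along x moves the drone along x, and likewise for y and z."""
    dx, dy, dz = displacement
    return {"vx": gain * dx, "vy": gain * dy, "vz": gain * dz}
```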
3A to 3C, the
Next, the driving
In one embodiment of the present invention, the driving
More specifically, the
That is, according to an embodiment of the present invention, hand motion recognition makes it possible to implement not only movement of the drone 4 but also specific actions of the drone 4, such as loading or unloading an object, so that the drone 4 can perform sophisticated operations impossible with a conventional RC controller.
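On the drone side, the patent describes the driving device only as an electromagnet switched on and off by an electrical signal. One plausible controller realization, assuming a Raspberry Pi style GPIO pin driving the electromagnet through a suitable switching circuit (hardware the patent does not specify), would be:

```python
import RPi.GPIO as GPIO  # assumes a Raspberry Pi style controller on the drone

MAGNET_PIN = 18  # illustrative GPIO pin wired to the electromagnet driver

GPIO.setmode(GPIO.BCM)
GPIO.setup(MAGNET_PIN, GPIO.OUT, initial=GPIO.LOW)

def apply_second_control_signal(signal):
    """Convert the received second control signal into an electrical
    signal: energise the electromagnet to mount an object, de-energise
    it to separate the object."""
    if signal == "mount":
        GPIO.output(MAGNET_PIN, GPIO.HIGH)   # electromagnet on: object is held
    elif signal == "separate":
        GPIO.output(MAGNET_PIN, GPIO.LOW)    # electromagnet off: object is released
```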
FIGS. 4A and 4B are diagrams illustrating the drone 4 operating in response to a second control signal according to an embodiment of the present invention.
4A is a diagram illustrating that an
4B, when the user takes a hand gesture according to an embodiment of the present invention, the
According to another embodiment of the present invention, the driving
FIG. 5 is a flowchart illustrating, in time sequence, a drone control method using hand motion recognition according to an embodiment of the present invention.
First, referring to FIG. 4, the
Next, the
Next, the
Next, the
Next, the first transmission /
In addition, the
Next, the second transmitting and receiving
That is, according to the present invention, the user can move the drone 4 to a target position through the movement of the hand, mount an object at the target position on the drone 4, move the drone 4 to another position, and then separate the object from the drone 4, so that the object can be moved to another place using only the movement of the hand. Therefore, according to the present invention, since a drone control method based on recognition of the motion of the user's hand is provided, the convenience of controlling the drone 4 can be improved.
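Tying the earlier sketches together, a ground-station loop of this kind could look roughly as follows; all helper names are the illustrative ones introduced above, not components named by the patent:

```python
import time

prev_hand = None
while True:
    skeleton = get_skeleton()
    if skeleton and "hand_right" in skeleton:
        hand = skeleton["hand_right"]
        if prev_hand is not None:
            fingers = count_extended_fingers(skeleton)
            first, second = generate_control_signals(prev_hand, hand, fingers)
            if first is not None:
                send_first_control_signal(hand_to_drone_command(first))
            if second is not None:
                send_second_control_signal(second)
        prev_hand = hand
    time.sleep(1.0 / 30.0)  # roughly the Kinect frame rate
```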
The specific implementations described in the present invention are examples and do not limit the scope of the invention in any way. For brevity of description, descriptions of conventional electronic configurations, control systems, software, and other functional aspects of such systems may be omitted. The connections or connecting members between the components shown in the figures illustrate functional and/or physical or circuit connections, and in an actual device they may be replaced or supplemented by a variety of functional, physical, or circuit connections. Unless a component is specifically described with terms such as "essential" or "important", it may not be a necessary component for the application of the present invention.
In the specification of the present invention (particularly in the claims), the term "the" and similar referring terms may cover both the singular and the plural. Where a range is described in the present invention, the invention encompasses each individual value falling within that range (unless stated otherwise), which is equivalent to reciting each individual value in the detailed description. Unless the order of the steps constituting a method according to the invention is explicitly stated or the contrary is indicated, the steps may be performed in any suitable order; the present invention is not necessarily limited to the described order of the steps. The use of any and all examples or exemplary language herein is simply intended to describe the present invention in detail and does not limit the scope of the invention, which is defined by the claims. Those skilled in the art will also appreciate that various modifications, combinations, and changes may be made according to design criteria and other factors within the scope of the appended claims or their equivalents.
The embodiments of the present invention described above can be implemented in the form of program instructions that can be executed through various computer components and recorded on a computer-readable recording medium. The computer-readable recording medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the computer-readable recording medium may be those specifically designed and configured for the present invention or those known to and usable by those skilled in the computer software arts. Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape, optical recording media such as CD-ROM and DVD, magneto-optical media such as floptical disks, and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include not only machine language code, such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like. The hardware device may be configured to operate as one or more software modules for performing the processing according to the present invention, and vice versa.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed embodiments; those skilled in the art will appreciate that various modifications and changes may be made thereto without departing from the scope of the present invention.
Accordingly, the spirit of the present invention should not be construed as being limited to the above-described embodiments, and the appended claims as well as everything equivalent or equivalently modified therefrom fall within the scope of the spirit of the present invention.
1: computing device 2: flight device
3: Electromagnet 4: Drones
100: control server 200:
300: Driving
Claims (20)
A computing device for generating a first control signal and a second control signal based on image processing of stereoscopic image data on the motion of a hand obtained from a Kinect sensor;
A flight unit including a flight device of the drone, the movement of which is controlled based on the first control signal received via a first communication network; and
A driving unit including a driving device mounted on the flight device of the drone, the driving device being controlled to mount or separate an object based on the second control signal received via a second communication network,
A drone control system using hand motion recognition.
The computing device comprising:
A Kinect sensor for stereoscopically capturing an operation of a user to acquire stereoscopic image data;
A skeleton recognition unit for recognizing a skeleton from the image data acquired by the kinect sensor;
A hand motion determination unit for determining a motion of the hand based on the recognized skeleton;
A control signal generator for generating the first control signal or the second control signal from the motion of the hand; A drone control system using hand motion recognition.
Wherein the control signal generator detects whether the determined hand movement contains a valid hand movement, and generates the first control signal, capable of controlling the flight device of the drone, when a valid hand movement is present; a drone control system using hand motion recognition.
Wherein the control signal generator detects whether the determined hand movement contains a valid hand gesture, and generates the second control signal, capable of controlling the driving device of the drone, when a valid hand gesture is present; a drone control system using hand motion recognition.
Wherein the driving device is an electromagnet which is turned on / off according to an electrical signal.
Wherein the first communication network is a WiFi communication network and the second communication network is a bluetooth communication network.
Wherein the driving unit further comprises a controller for converting the second control signal into an electrical signal for controlling the driving device.
Wherein the image processing is skeleton recognition, hand tracking, or finger tracking to detect movement of a user's hand.
Wherein the first control signal is a signal for controlling the flight device of the drone so that the drone can move in response to the movement of the hand.
Wherein the second control signal is a signal that controls the driving device to mount an object when the hand gesture determined from the motion of the hand is a fist-clenching gesture, and controls the driving device to separate the object when the hand gesture is a hand-stretching gesture; a drone control system using hand motion recognition.
Generating a first control signal and a second control signal based on image processing of stereoscopic image data on the movement of a hand obtained from a Kinect sensor;
Controlling movement of the flight device of the drone based on the first control signal received via a first communication network; and
Controlling a driving device mounted on the flight device of the drone to mount or separate an object based on the second control signal received through a second communication network,
A drone control method using hand motion recognition.
Wherein the generating the first control signal and the second control signal comprises:
Acquiring stereoscopic image data by stereoscopically capturing an operation of a user using a Kinect sensor;
Recognizing a skeleton from the image data acquired by the Kinect sensor;
Determining movement of the hand based on the recognized skeleton;
And a control signal generating step of generating the first control signal or the second control signal from the motion of the hand.
Wherein the control signal generation step includes detecting whether the determined hand movement contains a valid hand movement, and generating the first control signal, capable of controlling the flight device of the drone, when a valid hand movement is present; a drone control method using hand motion recognition.
Wherein the control signal generation step includes detecting whether the determined hand movement contains a valid hand gesture, and generating the second control signal, capable of controlling the driving device of the drone, when a valid hand gesture is present; a drone control method using hand motion recognition.
Wherein the driving device is an electromagnet which is turned on / off according to an electrical signal.
Wherein the first communication network is a WiFi communication network and the second communication network is a Bluetooth communication network.
Wherein the step of controlling the driving device comprises the step of converting the second control signal into an electrical signal for controlling the driving device.
Wherein the image processing performs skeletal recognition, hand tracking, or finger tracking to detect movement of a user's hand.
Wherein the first control signal is a signal for controlling the flight device of the drone so that the drone can move in correspondence with the movement of the hand.
Wherein the second control signal is a signal that controls the driving device to mount an object when the hand gesture determined from the motion of the hand is a fist-clenching gesture, and controls the driving device to separate the object when the hand gesture is a hand-stretching gesture; a drone control method using hand motion recognition.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160011144A KR20170090603A (en) | 2016-01-29 | 2016-01-29 | Method and system for controlling drone using hand motion tracking |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160011144A KR20170090603A (en) | 2016-01-29 | 2016-01-29 | Method and system for controlling drone using hand motion tracking |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170090603A true KR20170090603A (en) | 2017-08-08 |
Family
ID=59653005
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160011144A KR20170090603A (en) | 2016-01-29 | 2016-01-29 | Method and system for controlling drone using hand motion tracking |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170090603A (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108171121A (en) * | 2017-12-11 | 2018-06-15 | 翔升(上海)电子技术有限公司 | UAV Intelligent tracking and system |
KR101896239B1 (en) * | 2017-09-20 | 2018-09-07 | (주)텔트론 | System for controlling drone using motion capture |
EP3499332A2 (en) | 2017-12-14 | 2019-06-19 | Industry Academy Cooperation Foundation Of Sejong University | Remote control device and method for uav and motion control device attached to uav |
KR20190076407A (en) | 2017-12-22 | 2019-07-02 | 세종대학교산학협력단 | Remote control device and method of uav |
WO2019144300A1 (en) * | 2018-01-23 | 2019-08-01 | 深圳市大疆创新科技有限公司 | Target detection method and apparatus, and movable platform |
KR102032067B1 (en) | 2018-12-05 | 2019-10-14 | 세종대학교산학협력단 | Remote control device and method of uav based on reforcement learning |
JP2023081259A (en) * | 2021-11-30 | 2023-06-09 | 仁寶電腦工業股▲ふん▼有限公司 | Control device for unmanned aerial vehicle and control method thereof |
-
2016
- 2016-01-29 KR KR1020160011144A patent/KR20170090603A/en not_active Application Discontinuation
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101896239B1 (en) * | 2017-09-20 | 2018-09-07 | (주)텔트론 | System for controlling drone using motion capture |
CN108171121A (en) * | 2017-12-11 | 2018-06-15 | 翔升(上海)电子技术有限公司 | UAV Intelligent tracking and system |
EP3499332A2 (en) | 2017-12-14 | 2019-06-19 | Industry Academy Cooperation Foundation Of Sejong University | Remote control device and method for uav and motion control device attached to uav |
KR20190076407A (en) | 2017-12-22 | 2019-07-02 | 세종대학교산학협력단 | Remote control device and method of uav |
WO2019144300A1 (en) * | 2018-01-23 | 2019-08-01 | 深圳市大疆创新科技有限公司 | Target detection method and apparatus, and movable platform |
KR102032067B1 (en) | 2018-12-05 | 2019-10-14 | 세종대학교산학협력단 | Remote control device and method of uav based on reforcement learning |
US11567491B2 (en) | 2018-12-05 | 2023-01-31 | Industry Academy Cooperation Foundation Of Sejong University | Reinforcement learning-based remote control device and method for an unmanned aerial vehicle |
JP2023081259A (en) * | 2021-11-30 | 2023-06-09 | 仁寶電腦工業股▲ふん▼有限公司 | Control device for unmanned aerial vehicle and control method thereof |
US11921523B2 (en) | 2021-11-30 | 2024-03-05 | Compal Electronics, Inc. | Control device for unmanned aerial vehicle and control method therefor |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR20170090603A (en) | Method and system for controlling drone using hand motion tracking | |
KR102529137B1 (en) | Augmented reality display device with deep learning sensors | |
CN112513711B (en) | Method and system for resolving hemispherical ambiguities using position vectors | |
US20160309124A1 (en) | Control system, a method for controlling an uav, and a uav-kit | |
AU2021277680A1 (en) | Electromagnetic tracking with augmented reality systems | |
CN109313417A (en) | Help robot localization | |
US20200171667A1 (en) | Vision-based robot control system | |
KR20190072944A (en) | Unmanned aerial vehicle and operating method thereof, and automated guided vehicle for controlling movement of the unmanned aerial vehicle | |
JP2017174110A (en) | Unmanned mobile device, takeover method, and program | |
CN107787497A (en) | Method and apparatus for the detection gesture in the space coordinates based on user | |
WO2018078863A1 (en) | Drone control system, method, and program | |
US20180315211A1 (en) | Tracking system and method thereof | |
KR102460739B1 (en) | Industrial Electronic Badges | |
CN105773619A (en) | Electronic control system used for realizing grabbing behavior of humanoid robot and humanoid robot | |
Lippiello et al. | 3D monocular robotic ball catching with an iterative trajectory estimation refinement | |
JP7230941B2 (en) | Information processing device, control method, and program | |
Pradeep et al. | Follow me robot using bluetooth-based position estimation | |
Luo et al. | A scalable modular architecture of 3D object acquisition for manufacturing automation | |
Luo et al. | Stereo Vision-based Autonomous Target Detection and Tracking on an Omnidirectional Mobile Robot. | |
CN109596126A (en) | A kind of determination method and apparatus of robot space coordinates transformational relation | |
CN115657718A (en) | Aircraft dynamic target tracking navigation method and device and readable medium | |
CN206291910U (en) | The acquisition system of the attitude information of carrier | |
JP2009145296A (en) | Object direction detecting system and data structure | |
Zhang et al. | Vision based surface slope estimation for unmanned aerial vehicle perching | |
Jeong et al. | Relative Pose estimation for an integrated UGV-UAV robot system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal |