CN102219051A - Method for controlling four-rotor aircraft system based on human-computer interaction technology - Google Patents
- Publication number
- CN102219051A, CN2011101104714A, CN201110110471A
- Authority
- CN
- China
- Prior art keywords
- quadrotor aircraft
- rotor
- computer
- control quantity
- transmitted
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Manipulator (AREA)
- Selective Calling Equipment (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Toys (AREA)
Abstract
A method for controlling a quadrotor aircraft system based on human-computer interaction technology, belonging to the field of intelligent flying robots, is characterized in that an operator controls the quadrotor by gestures. Flight attitude control is accomplished by the cooperative operation of the four rotors arranged at the geometric vertices of the quadrotor, with yaw angle, pitch angle and roll angle as the three degrees of freedom. The human-computer interaction technology mainly utilizes OpenCV and OpenGL. The system captures depth images of the operator's hands with a depth camera; the depth images are analyzed and processed by the computer to obtain gesture information and to generate the corresponding control signals; the control signals are then sent to the aircraft through a radio communication device for execution, thereby mapping the motion state of the operator's hands to the motion state of the aircraft and completing the gesture control. Thanks to its long control range and intuitive correspondence between gestures and flight, the method can be applied to dangerous experiments and to industrial production processes that are difficult to perform manually.
Description
Technical field
The invention belongs to the field of intelligent flying robots, and specifically relates to a method of using a depth camera to capture human gestures to control a quadrotor aircraft.
Technical background
The quadrotor aircraft is the object controlled by the present invention. It is an aircraft with four propellers arranged in a square cross configuration, in which the rotors of each diagonally opposite pair rotate in the same direction. Quadrotor development in China is still in its infancy, with few mature results; abroad, however, quadrotors are popular, many well-known Western universities hold leading positions in quadrotor control, and many enthusiasts have joined in quadrotor building and research. With its small size, light weight and high agility, the quadrotor has won the favor of a great many aircraft enthusiasts.
The control device of the present invention performs gesture recognition with a depth camera, namely the Natal depth camera. Several mature solutions for spatial image processing already exist, but their complicated algorithms and huge computation libraries shut out researchers from many peripheral fields. The Natal depth camera used by the present invention cleverly combines a visible-light camera, an infrared camera and an infrared dot projector to accomplish advanced spatial image processing, providing convenient conditions for gesture recognition. This technology has begun to be applied in the field of machine vision and is well received by the public.
Summary of the invention
The object of the present invention is to provide a method of using a depth camera to capture human gestures for controlling aircraft attitude, fully demonstrating the flexibility and convenience of operator gesture control, reducing the general difficulty of controlling an aircraft, and expanding the field of application of small aircraft.
Based on the above object, the conception of the present invention is as follows: a control system is composed of a quadrotor aircraft, a planar position acquisition device and a gesture recognition device. The planar position acquisition device comprises an environment infrared camera mounted on a pair of parallel, spacing-adjustable supports; the environment infrared camera communicates with the hub computer through a wireless video transmission module and is used to feed the real-time horizontal flight state of the quadrotor back to the computer. The gesture recognition device is composed of a Natal depth camera and a computer; the depth camera captures human gestures and transmits the gesture information to the computer by wire. The quadrotor and the computer communicate through a ZigBee wireless data transmission module.
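The steps below transmit the ultrasonic height reading as ASCII decimal digits over the ZigBee link. A minimal sketch of such framing is shown here; the 'H' header, the fixed five-digit width and the newline terminator are illustrative assumptions, not details taken from the patent.

```python
def encode_height_ascii(height_mm: int) -> bytes:
    """Encode a height reading as ASCII decimal digits for the ZigBee link.
    Fixed-width framing (an assumption) lets the receiver split messages."""
    if not 0 <= height_mm <= 99999:
        raise ValueError("height out of range")
    return b"H" + str(height_mm).zfill(5).encode("ascii") + b"\n"

def decode_height_ascii(frame: bytes) -> int:
    """Inverse of encode_height_ascii; validates the assumed framing."""
    if not (frame.startswith(b"H") and frame.endswith(b"\n")):
        raise ValueError("bad frame")
    return int(frame[1:-1])

frame = encode_height_ascii(1234)
print(frame)                      # b'H01234\n'
print(decode_height_ascii(frame))  # 1234
```

The point of the ASCII representation, as the patent describes it, is that the microcontroller sends human-readable decimal digits rather than raw binary, which simplifies debugging on the computer side.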
The method for controlling a quadrotor aircraft system based on human-computer interaction technology provided by the present invention comprises the following concrete steps:
Step 1: The height information of the quadrotor 1 is obtained by the ultrasonic ranging module 13 onboard the quadrotor 1; this height information is processed by the onboard microcontroller 8 of the quadrotor 1 into decimal digits represented in ASCII and transmitted to the computer 7 through the ZigBee wireless communication module 14;
Step 2: The planar position information and angle information of the two pieces of infrared reflective cloth 11 on the upper surface of the quadrotor 1 are captured by the wireless environment infrared camera 5, thereby obtaining the planar position and yaw angle of the quadrotor 1, which are transmitted to the computer 7;
Step 3: The computer 7 combines the obtained height information and planar position information into x, y, z coordinates within the space covered by the wireless environment infrared camera 5, thus obtaining the current spatial coordinates of the quadrotor 1;
Step 4: A depth image of the operator's hand is taken by the Natal depth camera 6 and transmitted to the computer 7;
Step 5: The computer 7 parses the obtained depth image and extracts the x, y, z coordinates of the operator's hand within the coverage of the Natal depth camera 6, together with the angle between the operator's thumb and forefinger;
Step 6: Through a spatial coordinate transformation, the computer 7 maps the current spatial coordinates of the operator's hand obtained in step 5 to target spatial coordinates within the space where the quadrotor 1 operates;
Step 7: From the current spatial coordinates of the quadrotor 1 obtained in step 3 and the target spatial coordinates obtained in step 6, the computer 7 calculates PID control quantities for the spatial position of the quadrotor 1, comprising a forward/backward control quantity, a left/right translation control quantity and a raise/lower control quantity; it generates, from the thumb-forefinger angle obtained in step 5, a control quantity governing the pinch angle of the manipulator 12; and it calculates, from the yaw angle obtained in step 2, a yaw control quantity used to suppress spinning of the quadrotor 1. These quantities are transmitted to the onboard microcontroller 8 of the quadrotor through the ZigBee wireless communication module 14;
Step 8: The onboard microcontroller 8 of the quadrotor transmits the obtained forward/backward, left/right translation, raise/lower and yaw control quantities to the flight attitude controller 9 in PWM signal form, and transmits the obtained manipulator control quantity to the manipulator 12 in PWM signal form;
Step 9: The flight attitude controller 9 resolves the obtained control information into four motor drive signals for performing the corresponding action; after power amplification by the electronic speed controllers 15, these signals finally drive the four brushless motors 10, controlling the quadrotor 1 to move to the designated position while the manipulator pinches to the specified angle and completes the grasping action.
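Steps 6 and 7 above can be sketched as follows. This is a minimal illustration, not the patented implementation: the PID gains, the workspace bounds and the linear hand-to-target mapping are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class AxisPID:
    """Single-axis PID producing one control quantity (illustrative gains)."""
    kp: float = 1.2
    ki: float = 0.05
    kd: float = 0.4
    _integral: float = 0.0
    _prev_err: float = 0.0

    def step(self, target: float, current: float, dt: float) -> float:
        err = target - current
        self._integral += err * dt
        deriv = (err - self._prev_err) / dt
        self._prev_err = err
        return self.kp * err + self.ki * self._integral + self.kd * deriv

def map_hand_to_target(hand_xyz, hand_box, flight_box):
    """Linearly map a hand position inside the depth-camera workspace
    (hand_box = ((xmin, xmax), (ymin, ymax), (zmin, zmax)), meters)
    to a target position inside the flight volume (flight_box)."""
    out = []
    for v, (h0, h1), (f0, f1) in zip(hand_xyz, hand_box, flight_box):
        t = (v - h0) / (h1 - h0)      # normalize to [0, 1]
        t = min(max(t, 0.0), 1.0)     # clamp to the workspace
        out.append(f0 + t * (f1 - f0))
    return tuple(out)

# Hand at the center of its workspace -> target at the flight-volume center.
hand_box = ((-0.5, 0.5), (-0.5, 0.5), (0.5, 1.5))   # assumed hand workspace
flight_box = ((0.0, 4.0), (0.0, 4.0), (0.0, 2.0))   # assumed flight volume
target = map_hand_to_target((0.0, 0.0, 1.0), hand_box, flight_box)
print(target)  # (2.0, 2.0, 1.0)

pid_x = AxisPID()
u_forward = pid_x.step(target[0], current=1.0, dt=0.02)  # forward/backward quantity
```

One such controller per axis yields the forward/backward, left/right and raise/lower control quantities; a fourth on the measured yaw angle (with a fixed zero target) gives the spin-suppressing yaw quantity.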
Compared with other aircraft control methods in related fields, the present invention has the following beneficial effects:
First, owing to the special structural design of the quadrotor itself, it is well suited to flexible operation in small spaces; combined with the spatial positioning system used by the present invention, its positioning in flight is more accurate. Together, these two points allow the quadrotor to perform demanding fine movements excellently, for example auxiliary tasks such as grasping objects or writing characters.
Second, because the human-computer interaction of this quadrotor control system is based on a depth camera, the host computer can obtain and process gesture and posture information accurately and effectively, thereby realizing real-time gesture control by the operator. The application of this technology makes control of the quadrotor more flexible and convenient: complicated spatial maneuvers can be commanded through a simple correspondence between gesture space and flight space.
The present invention is described in further detail below with reference to the drawings and the specific embodiment.
Description of drawings
Fig. 1 is a schematic diagram of the architecture of the quadrotor aircraft system based on human-computer interaction technology used in the present invention;
Fig. 2 is a top view of the quadrotor;
Fig. 3 is a bottom view of the quadrotor;
Fig. 4 is a system block diagram of the present invention.
In the figures: 1 - quadrotor aircraft; 2 - planar position acquisition device; 3 - gesture recognition device; 4 - two parallel spacing-adjustable supports; 5 - wireless environment infrared camera; 6 - Natal depth camera; 7 - computer; 8 - onboard microcontroller; 9 - flight attitude controller; 10 - brushless motor; 11 - infrared reflective cloth; 12 - manipulator; 13 - ultrasonic ranging module; 14 - ZigBee wireless communication module; 15 - electronic speed controller.
The specific embodiment
Part 1: implementation of controlling the spatial position of the quadrotor through the position of the moving hand
Step 1: The ultrasonic ranging module 13 onboard the quadrotor is triggered to obtain a pulse signal whose delay relative to the trigger signal is positively correlated with the quadrotor's height; the onboard microcontroller (an MC9S12XS128) measures this time difference via input capture to obtain the height of the quadrotor, processes it into decimal digits represented in ASCII, and transmits it to the computer through the ZigBee wireless communication module 14;
Step 2: The environment infrared camera captures the infrared light reflected by the two pieces of infrared reflective cloth 11 arranged symmetrically about the center of the quadrotor's upper surface (all other bands are removed by an infrared filter); a CMOS sensor converts this optical signal into an infrared video stream, which is transmitted to the computer through the wireless video transmission module;
Step 3: The computer parses the obtained video stream with OpenCV:
1. convert the infrared image to grayscale;
2. locate the ROI (region of interest);
3. apply Gaussian filtering and then binarize;
4. detect the contours, count them and number each independent contour, and from these compute the x, y coordinates of the quadrotor's geometric center in the environment infrared camera's field of view together with its yaw angle;
5. using the perspective principle of the camera, combined with the height information passed back from the ultrasonic ranging module by the onboard microcontroller, compute the z coordinate of the aircraft;
Step 4: A depth image of the operator's hand is taken by the Natal depth camera and transmitted to the computer;
Step 5: The computer parses the obtained depth image and extracts the x, y, z coordinates of the operator's hand within the coverage of the Natal depth camera, thus obtaining the current spatial coordinates of the operator's hand;
Step 6: Through a spatial coordinate transformation, the computer maps the current spatial coordinates of the operator's hand obtained in step 5 to target spatial coordinates within the space where the quadrotor operates;
Step 7: From the current spatial coordinates of the quadrotor obtained in step 3 and the target spatial coordinates obtained in step 6, the computer calculates PID control quantities comprising a forward/backward control quantity, a left/right translation control quantity and a raise/lower control quantity, and transmits them to the onboard microcontroller of the quadrotor through the ZigBee wireless communication module;
Step 8: The onboard microcontroller of the quadrotor transmits the obtained forward/backward, left/right translation and raise/lower control quantities to the flight attitude controller 9 in PWM signal form;
Step 9: The flight attitude controller resolves the obtained control information into four motor drive signals for performing the corresponding action; after power amplification by the electronic speed controllers 15, these signals finally drive the four brushless motors, controlling the quadrotor to move to the designated position.
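Substeps 4 and 5 of step 3 recover the quadrotor's planar pose from the two reflective markers. Below is a minimal sketch of one way to do this: yaw from the line through the two marker centroids, and depth from the pinhole relation between the known marker spacing and its pixel length. The focal length and marker spacing are assumed values, real code would obtain the centroids from OpenCV contour moments, and the patent additionally fuses the ultrasonic height reading rather than relying on marker scale alone.

```python
import math

def pose_from_markers(c1, c2, focal_px, marker_dist_m):
    """Planar pose of the quadrotor from the pixel centroids of the two
    reflective-cloth markers seen by the overhead infrared camera.

    c1, c2        -- (u, v) pixel centroids of the two markers
    focal_px      -- camera focal length in pixels (assumed known)
    marker_dist_m -- real distance between the markers in meters
    """
    cx = (c1[0] + c2[0]) / 2.0          # geometric center in the image
    cy = (c1[1] + c2[1]) / 2.0
    # Yaw is the orientation of the line through the two markers.
    yaw = math.atan2(c2[1] - c1[1], c2[0] - c1[0])
    pix_dist = math.hypot(c2[0] - c1[0], c2[1] - c1[1])
    # Pinhole model: the pixel length of a known physical length is
    # inversely proportional to distance, so depth = f * L / l_pixels.
    depth = focal_px * marker_dist_m / pix_dist
    return (cx, cy, depth, yaw)

x, y, z, yaw = pose_from_markers((300, 240), (340, 240),
                                 focal_px=600, marker_dist_m=0.2)
print(x, y, z, yaw)  # 320.0 240.0 3.0 0.0
```

Note that two symmetric markers leave a 180-degree ambiguity in yaw; distinguishable markers (for example, different sizes) would resolve it.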
Part 2: implementation of controlling the quadrotor's onboard manipulator to grasp an object by means of a grasping gesture
Step 1: A depth image of the operator's hand is taken by the Natal depth camera and transmitted to the computer;
Step 2: The computer parses the obtained depth image:
1. reconstruct a three-dimensional view of the operator's hand with OpenGL;
2. using the Bayesian-framework method for tracking human posture in depth images proposed by researchers Youding Zhu and Kikuo Fujimura of the Honda Research Institute, extract the angle between the operator's thumb and forefinger from the three-dimensional view and transmit it to the onboard microcontroller of the quadrotor through the ZigBee wireless communication module;
Step 3: The onboard microcontroller processes the obtained angle information into the corresponding servo control signal governing the pinch distance of the quadrotor's onboard manipulator, finally completing the grasping action.
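The angle-to-servo mapping of step 3 can be sketched as follows. The usable finger-angle range and the standard 1.0 to 2.0 ms hobby-servo pulse convention are assumptions for illustration, not values from the patent.

```python
def angle_to_servo_pulse_us(finger_angle_deg,
                            angle_range=(0.0, 60.0),
                            pulse_range_us=(1000.0, 2000.0)):
    """Map the measured thumb-forefinger angle to a servo pulse width.

    A fully pinched hand (0 deg) closes the gripper (1.0 ms pulse);
    a fully open hand (60 deg, an assumed limit) opens it (2.0 ms pulse).
    """
    a0, a1 = angle_range
    p0, p1 = pulse_range_us
    t = (finger_angle_deg - a0) / (a1 - a0)
    t = min(max(t, 0.0), 1.0)   # clamp angles outside the usable range
    return p0 + t * (p1 - p0)

print(angle_to_servo_pulse_us(30.0))   # half-open hand -> 1500.0 us
```

On the microcontroller this pulse width would be loaded into a PWM timer channel at the usual 50 Hz servo frame rate; clamping keeps noisy angle estimates from driving the gripper past its mechanical limits.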
Claims (1)
1. A method for controlling a quadrotor aircraft system based on human-computer interaction technology, comprising the following concrete steps:
Step 1: the height information of the quadrotor (1) is obtained by the ultrasonic ranging module (13) onboard the quadrotor (1); this height information is processed by the onboard microcontroller (8) of the quadrotor (1) into decimal digits represented in ASCII and transmitted to the computer (7) through the ZigBee wireless communication module (14);
Step 2: the planar position information and angle information of the two pieces of infrared reflective cloth (11) on the upper surface of the quadrotor (1) are captured by the wireless environment infrared camera (5), thereby obtaining the planar position and yaw angle of the quadrotor (1), which are transmitted to the computer (7);
Step 3: the computer (7) combines the obtained height information and planar position information into x, y, z coordinates within the space covered by the wireless environment infrared camera (5), thus obtaining the current spatial coordinates of the quadrotor (1);
Step 4: a depth image of the operator's hand is taken by the Natal depth camera (6) and transmitted to the computer (7);
Step 5: the computer (7) parses the obtained depth image and extracts the x, y, z coordinates of the operator's hand within the coverage of the Natal depth camera (6), together with the angle between the operator's thumb and forefinger;
Step 6: through a spatial coordinate transformation, the computer (7) maps the current spatial coordinates of the operator's hand obtained in step 5 to target spatial coordinates within the space where the quadrotor (1) operates;
Step 7: from the current spatial coordinates of the quadrotor (1) obtained in step 3 and the target spatial coordinates obtained in step 6, the computer (7) calculates PID control quantities for the spatial position of the quadrotor (1), comprising a forward/backward control quantity, a left/right translation control quantity and a raise/lower control quantity; generates, from the thumb-forefinger angle obtained in step 5, a control quantity governing the pinch angle of the manipulator (12); and calculates, from the yaw angle obtained in step 2, a yaw control quantity used to suppress spinning of the quadrotor (1); these quantities are transmitted to the onboard microcontroller (8) of the quadrotor through the ZigBee wireless communication module (14);
Step 8: the onboard microcontroller (8) of the quadrotor transmits the obtained forward/backward, left/right translation, raise/lower and yaw control quantities to the flight attitude controller (9) in PWM signal form, and transmits the obtained manipulator control quantity to the manipulator (12) in PWM signal form;
Step 9: the flight attitude controller (9) resolves the obtained control information into four motor drive signals for performing the corresponding action; after power amplification by the electronic speed controllers (15), these signals finally drive the four brushless motors (10), controlling the quadrotor (1) to move to the designated position while the manipulator pinches to the specified angle and completes the grasping action.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN 201110110471 CN102219051B (en) | 2011-04-29 | 2011-04-29 | Method for controlling four-rotor aircraft system based on human-computer interaction technology |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN 201110110471 CN102219051B (en) | 2011-04-29 | 2011-04-29 | Method for controlling four-rotor aircraft system based on human-computer interaction technology |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102219051A (en) | 2011-10-19
CN102219051B CN102219051B (en) | 2013-01-09 |
Family
ID=44775817
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN 201110110471 Expired - Fee Related CN102219051B (en) | 2011-04-29 | 2011-04-29 | Method for controlling four-rotor aircraft system based on human-computer interaction technology |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102219051B (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4490794A (en) * | 1982-01-18 | 1984-12-25 | Sperry Corporation | Altitude preselect system for aircraft without air data computer |
US4829220A (en) * | 1986-09-15 | 1989-05-09 | Aserospatiale Societe Nationale Industrielle | Electric flight control system with aircraft incidence protection |
US6390417B1 (en) * | 1999-06-30 | 2002-05-21 | Honda Giken Kogyo Kabushiki Kaisha | Drag control system for flying machine, process for estimating drag of flying machine, boundary layer control system, and boundary layer control process |
CN101699510A (en) * | 2009-09-02 | 2010-04-28 | 北京科技大学 | Particle filtering-based pupil tracking method in sight tracking system |
CN102012741A (en) * | 2010-11-16 | 2011-04-13 | 吕文阁 | Camera device-based multimedia human-computer interaction method and system |
Cited By (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102354345A (en) * | 2011-10-21 | 2012-02-15 | 北京理工大学 | Medical image browse device with somatosensory interaction mode |
CN103365294A (en) * | 2012-03-29 | 2013-10-23 | 鸿富锦精密工业(深圳)有限公司 | Unmanned aerial vehicle control system and method thereof |
CN103049007A (en) * | 2012-12-29 | 2013-04-17 | 安科智慧城市技术(中国)有限公司 | System and method for operating mobile platform based on head movement of operator |
CN103940823A (en) * | 2014-04-25 | 2014-07-23 | 河海大学常州校区 | Iron tower defect detection system based on aircraft and aircraft positioning method |
CN105321315A (en) * | 2014-07-30 | 2016-02-10 | 芯讯通无线科技(上海)有限公司 | Aircraft system and communication method thereof |
CN105334851A (en) * | 2014-08-12 | 2016-02-17 | 深圳市银星智能科技股份有限公司 | Mobile device capable of sensing gesture |
CN104581072A (en) * | 2015-01-05 | 2015-04-29 | 惠州市加迈电器有限公司 | Night shooting equipment |
CN104565876A (en) * | 2015-01-05 | 2015-04-29 | 惠州市加迈电器有限公司 | Portable lighting device |
CN104590554A (en) * | 2015-01-05 | 2015-05-06 | 惠州市加迈电器有限公司 | Path-finding lighting equipment |
CN104581072B (en) * | 2015-01-05 | 2018-10-02 | 嘉兴欧祥通讯设备有限公司 | Night capture apparatus |
CN104606901A (en) * | 2015-01-29 | 2015-05-13 | 赵旭 | Height-air-pressure-detection infrared-reflection-induction type toy air vehicle structure and application of height-air-pressure-detection infrared-reflection-induction type toy air vehicle structure |
CN106143927A (en) * | 2015-04-03 | 2016-11-23 | 云南师范大学 | A kind of indoor four-axle aircraft closed-loop control system based on Kinect |
CN104816829A (en) * | 2015-05-12 | 2015-08-05 | 广东澄星航模科技股份有限公司 | Skyeye aircraft applicable to investigation |
CN104808799A (en) * | 2015-05-20 | 2015-07-29 | 成都通甲优博科技有限责任公司 | Unmanned aerial vehicle capable of indentifying gesture and identifying method thereof |
CN104898524A (en) * | 2015-06-12 | 2015-09-09 | 江苏数字鹰科技发展有限公司 | Unmanned plane remote control system based on gesture |
CN104898524B (en) * | 2015-06-12 | 2018-01-09 | 江苏数字鹰科技发展有限公司 | No-manned machine distant control system based on gesture |
CN104965516A (en) * | 2015-06-15 | 2015-10-07 | 东南大学 | Method for controlling four-rotor aircraft to perform grabbing operation in the air based on visual and force feedback |
CN105607740A (en) * | 2015-12-29 | 2016-05-25 | 清华大学深圳研究生院 | Unmanned aerial vehicle control method and device based on computer vision |
CN105843251A (en) * | 2016-03-28 | 2016-08-10 | 沈阳上博智拓科技有限公司 | Method, device, and system used for controlling aircraft |
CN105843251B (en) * | 2016-03-28 | 2020-04-07 | 沈阳上博智拓科技有限公司 | Method, device and system for controlling aircraft |
CN105843252A (en) * | 2016-03-30 | 2016-08-10 | 乐视控股(北京)有限公司 | Unmanned aircraft system and flight control method thereof |
CN109415119A (en) * | 2016-04-08 | 2019-03-01 | 列奥纳多股份公司 | Method of the rotor and detection blade for the aircraft that can be hovered relative to the posture of the hub of this rotor |
CN108214509A (en) * | 2016-07-12 | 2018-06-29 | 泉州臻美智能科技有限公司 | A kind of robot |
CN106054911A (en) * | 2016-07-22 | 2016-10-26 | 华南理工大学 | Quadrocopter capable of gesture identification operation and method thereof |
CN106774365A (en) * | 2016-12-20 | 2017-05-31 | 华中科技大学 | A kind of unmanned plane debugging system based on external view positioning |
CN110687902B (en) * | 2016-12-21 | 2020-10-20 | 杭州零零科技有限公司 | System and method for controller-free user drone interaction |
CN110687902A (en) * | 2016-12-21 | 2020-01-14 | 杭州零零科技有限公司 | System and method for controller-free user drone interaction |
CN107645599A (en) * | 2017-09-21 | 2018-01-30 | 宇龙计算机通信科技(深圳)有限公司 | A kind of control method, terminal and computer-readable recording medium |
CN107831791B (en) * | 2017-11-17 | 2020-12-15 | 深圳意动航空科技有限公司 | Unmanned aerial vehicle control method and device, control equipment and storage medium |
CN107831791A (en) * | 2017-11-17 | 2018-03-23 | 南方科技大学 | A kind of control method of unmanned plane, device, controlling equipment and storage medium |
CN110825121A (en) * | 2018-08-08 | 2020-02-21 | 纬创资通股份有限公司 | Control device and unmanned aerial vehicle control method |
CN110825121B (en) * | 2018-08-08 | 2023-02-17 | 纬创资通股份有限公司 | Control device and unmanned aerial vehicle control method |
CN109176559A (en) * | 2018-10-24 | 2019-01-11 | 上海思依暄机器人科技股份有限公司 | A kind of robot and control method, system |
CN112947589A (en) * | 2021-03-10 | 2021-06-11 | 南京理工大学 | Indoor four-rotor unmanned aerial vehicle based on dual-core DSP gesture control |
CN113282110A (en) * | 2021-07-23 | 2021-08-20 | 季华实验室 | Flying robot and human cooperative operation method and device and flying robot |
CN113282109A (en) * | 2021-07-23 | 2021-08-20 | 季华实验室 | Unmanned aerial vehicle and human cooperative operation system |
CN113282110B (en) * | 2021-07-23 | 2021-10-22 | 季华实验室 | Flying robot and human cooperative operation method and device and flying robot |
CN116337086A (en) * | 2023-05-29 | 2023-06-27 | 中国人民解放军海军工程大学 | Method, system, medium and terminal for calculating optimal capturing position of unmanned aerial vehicle network capturing |
CN116337086B (en) * | 2023-05-29 | 2023-08-04 | 中国人民解放军海军工程大学 | Method, system, medium and terminal for calculating optimal capturing position of unmanned aerial vehicle network capturing |
Also Published As
Publication number | Publication date |
---|---|
CN102219051B (en) | 2013-01-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102219051B (en) | Method for controlling four-rotor aircraft system based on human-computer interaction technology | |
CN204997657U (en) | Biomimetic mechanical hand with imitate function | |
CN111694428B (en) | Gesture and track remote control robot system based on Kinect | |
CN104589356B (en) | The Dextrous Hand remote operating control method caught based on Kinect human hand movement | |
CN110480634B (en) | Arm guide motion control method for mechanical arm motion control | |
Frank et al. | Realizing mixed-reality environments with tablets for intuitive human-robot collaboration for object manipulation tasks | |
CN104950902B (en) | The control method and multi-rotor aerocraft of multi-rotor aerocraft | |
CN111055281A (en) | ROS-based autonomous mobile grabbing system and method | |
CN104570731A (en) | Uncalibrated human-computer interaction control system and method based on Kinect | |
CN104898524A (en) | Unmanned plane remote control system based on gesture | |
CN100361792C (en) | Controlling system of movable manipulator | |
CN111462154A (en) | Target positioning method and device based on depth vision sensor and automatic grabbing robot | |
US20130202212A1 (en) | Information processing apparatus, information processing method, and computer program | |
CN106020494B (en) | Three-dimensional gesture recognition method based on mobile tracking | |
CN110640742A (en) | Industrial robot platform of multi-mode control | |
CN103853464A (en) | Kinect-based railway hand signal identification method | |
CN105234940A (en) | Robot and control method thereof | |
CN2645862Y (en) | Mobile mechanical arm system | |
Raj et al. | Static gesture recognition based precise positioning of 5-DOF robotic arm using FPGA | |
Teke et al. | Real-time and robust collaborative robot motion control with Microsoft Kinect® v2 | |
CN103862476A (en) | Position inverse solution method for mechanical arm with six freedom degrees | |
Yang et al. | Development of flight simulation system based on leap motion controller | |
Jo et al. | Tracking and interaction based on hybrid sensing for virtual environments | |
Liu et al. | A deep-learning based multi-modality sensor calibration method for usv | |
CN116476074A (en) | Remote mechanical arm operation system based on mixed reality technology and man-machine interaction method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
C17 | Cessation of patent right | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 2013-01-09; Termination date: 2014-04-29 |