CN102219051B - Method for controlling four-rotor aircraft system based on human-computer interaction technology - Google Patents
- Publication number
- CN102219051B (granted publication of application CN201110110471A)
- Authority
- CN
- China
- Prior art keywords
- quadrotor
- obtains
- computer
- control
- transferred
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Manipulator (AREA)
- Toys (AREA)
- Selective Calling Equipment (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
A method for controlling a four-rotor aircraft system based on human-computer interaction technology, belonging to the field of intelligent flying robots, is characterized in that an operator can control the four-rotor aircraft with hand gestures. Flight attitude control is accomplished by the cooperative operation of the four rotors arranged at the geometric vertices of the aircraft, with three degrees of freedom: yaw angle, pitch angle and roll angle. The human-computer interaction technology mainly relies on OpenCV and OpenGL. The system captures depth images of the operator's hands with a depth camera; the computer analyzes and processes the depth images to obtain gesture information and generates the corresponding control signals; the control signals are then sent to the aircraft over a radio communication link for execution, mapping the motion state of the operator's hands to the motion state of the aircraft and completing the gesture control. Thanks to its long control distance and intuitive gesture mapping, the method can be applied to dangerous experiments and to industrial production processes that are difficult to perform directly.
Description
Technical field
The invention belongs to the field of intelligent flying robots, and specifically relates to a method of completing quadrotor control by capturing human gestures with a depth camera.
Technical background
The quadrotor is the controlled object of the present invention. It is an aircraft with four propellers arranged in a square cross structure; the two rotors in each of the two diagonal pairs rotate in the same direction. Domestic quadrotor research is still in its infancy, with few mature results; abroad, however, the quadrotor is quite popular: many well-known western universities hold leading positions in quadrotor control, and many enthusiasts have joined in the manufacture and study of quadrotors. With its small size, light weight and high agility, the quadrotor has won the favor of a large number of aircraft enthusiasts.
The control device of the present invention performs gesture recognition with a depth camera, namely a Natal depth camera. Several mature solutions for spatial image processing already exist, but their complicated algorithms and huge computing libraries keep researchers in many peripheral fields locked out. The Natal depth camera used in the present invention ingeniously combines a visible-light camera, an infrared camera and an infrared dot projector to complete the advanced processing of spatial images, providing convenient conditions for gesture recognition. This technology has already begun to be used in the field of machine vision and is well received by the public.
Summary of the invention
The object of the present invention is to provide a method of capturing human gestures with a depth camera to control the attitude of an aircraft, fully demonstrating the flexibility and convenience of the operator's gesture control, reducing the general difficulty of controlling aircraft, and expanding the field of application of small aircraft.
Based on the above purpose, the conception of the present invention is as follows: the control system consists of a quadrotor, a plane-position acquisition device and a gesture recognition device. The plane-position acquisition device comprises a pair of parallel supports with adjustable spacing on which environment infrared cameras are mounted; the environment infrared cameras communicate with the hub computer through a wireless video transmission module and provide real-time feedback of the quadrotor's horizontal-plane flight status to the computer. The gesture recognition device consists of a Natal depth camera and a computer; the depth camera captures human gestures and transfers the gesture information to the computer over a wired connection. The quadrotor and the computer communicate through a ZigBee wireless data transmission module.
The concrete steps of the quadrotor system control method based on human-computer interaction technology provided by the present invention are as follows:
Step 1: The height information of the quadrotor 1 is obtained from its airborne ultrasonic ranging module 13; the airborne microcontroller 8 of the quadrotor 1 converts this height information into decimal digits represented in ASCII and transfers it to the computer 7 through the ZigBee wireless communication module 14;
Step 2: The wireless environment infrared cameras 5 capture the plane position and angle information of the two patches of infrared reflecting cloth 11 on the upper surface of the quadrotor 1, thereby obtaining the plane position and yaw angle of the quadrotor 1, which are transferred to the computer 7;
Step 3: The computer 7 combines the obtained height information and plane position information into x, y, z coordinates in the space covered by the wireless environment infrared cameras 5, thus obtaining the current spatial coordinates of the quadrotor 1;
Step 4: The Natal depth camera 6 takes a depth image of the operator's hand and transfers it to the computer 7;
Step 5: The computer 7 parses the obtained depth image and extracts the x, y, z coordinates of the operator's hand within the coverage of the Natal depth camera 6, as well as the angle between the operator's thumb and forefinger;
Step 6: Through a spatial coordinate transformation, the computer 7 maps the current spatial coordinates of the operator's hand obtained in step 5 to target spatial coordinates within the flight space of the quadrotor 1;
Step 7: From the current spatial coordinates of the quadrotor 1 obtained in step 3 and the target spatial coordinates obtained in step 6, the computer 7 calculates the PID control quantities for the spatial position of the quadrotor 1, including the forward-backward control quantity, the left-right translation control quantity and the up-down control quantity; from the thumb-forefinger angle obtained in step 5 it generates the control quantity for the pinch angle of the manipulator 12; from the yaw angle obtained in step 2 it calculates a yaw control quantity that suppresses spin of the quadrotor 1; all of these are transferred to the airborne microcontroller 8 of the quadrotor through the ZigBee wireless communication module 14;
Step 8: The airborne microcontroller 8 of the quadrotor transfers the received forward-backward, left-right translation, up-down and yaw control quantities to the attitude controller 9 in the form of PWM signals, and transfers the received manipulator control quantity to the manipulator 12 in the form of a PWM signal;
Step 9: The attitude controller 9 resolves the received control information into four motor drive signals for the corresponding action task; after power amplification by the electronic speed controllers 15 these finally drive the four brushless motors 10, so that the quadrotor 1 moves to the designated position and the manipulator pinches to the specified angle to complete the grasping action.
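The position-loop calculation of step 7 can be sketched as three independent PID controllers, one per spatial axis. This is a minimal illustrative sketch only: the gains, the 50 Hz loop rate and the function names are assumptions, since the patent states only that PID control quantities are computed from the current and target coordinates.

```python
class PID:
    """Single-axis PID controller (illustrative gains; the patent gives no values)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, current):
        error = target - current
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One controller per axis: forward-backward (x), left-right (y), up-down (z).
dt = 0.02  # assumed 50 Hz control loop
pids = {axis: PID(kp=1.2, ki=0.05, kd=0.4, dt=dt) for axis in "xyz"}

def position_control(current, target):
    """Map (current, target) spatial coordinates to the three position control quantities."""
    return {axis: pids[axis].update(target[axis], current[axis]) for axis in "xyz"}
```

The three outputs would then be quantized to PWM duty cycles by the airborne microcontroller, as described in step 8.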
Compared with other aircraft controls in related fields, the present invention has the following beneficial effects:
First, because of the unique structural design of the quadrotor itself, it is well suited to flexible operation in small spaces; with the spatial positioning system used in the present invention, its flight positioning is even more accurate. These two points enable the quadrotor to perform highly difficult fine movements and auxiliary work such as grasping objects and writing characters outstandingly.
Second, because the human-computer interaction of this quadrotor control system is based on a depth camera, the host computer can obtain and effectively process the gesture and attitude information fairly accurately, realizing real-time gesture control by the operator. This technology makes the control of the quadrotor more flexible and convenient, so that complicated aircraft maneuvers in space can be completed through a simple correspondence with gesture space.
The present invention is described in further detail below in conjunction with the accompanying drawings and the specific embodiment.
Description of drawings
Fig. 1 is a schematic diagram of the architecture of the quadrotor system based on human-computer interaction technology used in the present invention;
Fig. 2 is a top view of the quadrotor;
Fig. 3 is a bottom view of the quadrotor;
Fig. 4 is a system block diagram of the present invention.
In the figures: 1-quadrotor, 2-plane-position acquisition device, 3-gesture recognition device, 4-two parallel supports with adjustable spacing, 5-wireless environment infrared camera, 6-Natal depth camera, 7-computer, 8-airborne microcontroller, 9-attitude controller, 10-brushless motor, 11-infrared reflecting cloth, 12-manipulator, 13-ultrasonic ranging module, 14-ZigBee wireless communication module, 15-electronic speed controller.
The specific embodiment
Part one: the implementation method of controlling the spatial position of the quadrotor by moving the hand
Step 1: The airborne ultrasonic ranging module 13 of the quadrotor is triggered to obtain a pulse signal whose delay relative to the trigger signal is positively correlated with the quadrotor's height; the airborne microcontroller of the quadrotor (an MC9S12XS128) measures this time difference by input capture to obtain the height information of the quadrotor, then converts it into decimal digits represented in ASCII and transfers it to the computer through the ZigBee wireless communication module 14;
Step 2: The environment infrared cameras capture the infrared light reflected by the two patches of infrared reflecting cloth 11 arranged centrally symmetrically on the upper surface of the quadrotor (all other bands are removed by an infrared filter); a CMOS sensor converts this optical signal into an infrared image video stream, which is transferred to the computer through the wireless video transmission module;
Step 3: The computer parses the received video stream with OpenCV:
1. convert the infrared image to grayscale;
2. locate the ROI region;
3. apply Gaussian filtering and then binarize;
4. detect the contours, count them and number each independent contour, and from these compute the x, y coordinates and the yaw angle of the geometric center of the quadrotor within the field of view of the environment infrared cameras;
5. using the perspective principle of the cameras, combined with the height information from the ultrasonic ranging module passed back by the airborne microcontroller of the quadrotor, compute the z coordinate of the aircraft;
Step 4: The Natal depth camera takes a depth image of the operator's hand and transfers it to the computer;
Step 5: The computer parses the obtained depth image and extracts the x, y, z coordinates of the operator's hand within the coverage of the Natal depth camera, thus obtaining the current spatial coordinates of the operator's hand;
Step 6: Through a spatial coordinate transformation, the computer maps the current spatial coordinates of the operator's hand obtained in step 5 to target spatial coordinates within the flight space of the quadrotor;
Step 7: From the current spatial coordinates of the quadrotor obtained in step 3 and the target spatial coordinates obtained in step 6, the computer calculates the PID control quantities, including the forward-backward control quantity, the left-right translation control quantity and the up-down control quantity, and transfers them to the airborne microcontroller of the quadrotor through the ZigBee wireless communication module;
Step 8: The airborne microcontroller of the quadrotor transfers the received forward-backward, left-right translation and up-down control quantities to the attitude controller 9 in the form of PWM signals;
Step 9: The attitude controller resolves the received control information into four motor drive signals for the corresponding action task; after power amplification by the electronic speed controllers 15 these finally drive the four brushless motors, so that the quadrotor moves to the designated position.
Part two: the implementation method of controlling the airborne manipulator of the quadrotor to grasp an object with a pinch gesture
Step 1: The Natal depth camera takes a depth image of the operator's hand and transfers it to the computer;
Step 2: The computer parses the obtained depth image:
1. reconstruct a three-dimensional view of the operator's hand with OpenGL;
2. using the Bayesian-framework method for tracking human posture in depth images proposed by researchers Youding Zhu and Kikuo Fujimura of the Honda Research Institute, extract the angle between the operator's thumb and forefinger from the three-dimensional view and transfer it to the airborne microcontroller of the quadrotor through the ZigBee wireless communication module;
Step 3: The airborne microcontroller of the quadrotor processes the received angle information into the corresponding servo control signal, which controls the pinch distance of the airborne manipulator and finally completes the grasping action.
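The angle-to-servo conversion in step 3 can be sketched as a linear mapping from the measured thumb-forefinger angle to a servo pulse width. The 0-60 degree angle range and the standard 1000-2000 microsecond pulse range are assumptions for illustration; the patent does not specify concrete values.

```python
def pinch_to_servo_pulse(angle_deg, angle_min=0.0, angle_max=60.0,
                         pulse_min_us=1000.0, pulse_max_us=2000.0):
    """Map the thumb-forefinger angle to a servo PWM pulse width in microseconds.

    The angle range and the standard 1000-2000 us hobby-servo pulse range
    are illustrative assumptions, not values from the patent."""
    # Clamp the measured angle to the usable range, then interpolate linearly.
    a = max(angle_min, min(angle_max, angle_deg))
    span = (a - angle_min) / (angle_max - angle_min)
    return pulse_min_us + span * (pulse_max_us - pulse_min_us)
```

A fully closed hand would thus command one end of the servo travel and a fully open pinch the other, with out-of-range measurements clamped rather than extrapolated.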
Claims (1)
1. A quadrotor system control method based on human-computer interaction technology, the concrete steps of which are as follows:
Step 1: the height information of the quadrotor (1) is obtained from its airborne ultrasonic ranging module (13); the airborne microcontroller (8) of the quadrotor (1) converts this height information into decimal digits represented in ASCII and transfers it to the computer (7) through the ZigBee wireless communication module (14);
Step 2: the wireless environment infrared cameras (5) capture the plane position and angle information of the two patches of infrared reflecting cloth (11) on the upper surface of the quadrotor (1), thereby obtaining the plane position and yaw angle of the quadrotor (1), which are transferred to the computer (7);
Step 3: the computer (7) combines the obtained height information and plane position information into x, y, z coordinates in the space covered by the wireless environment infrared cameras (5), thus obtaining the current spatial coordinates of the quadrotor (1);
Step 4: the Natal depth camera (6) takes a depth image of the operator's hand and transfers it to the computer (7);
Step 5: the computer (7) parses the obtained depth image and extracts the x, y, z coordinates of the operator's hand within the coverage of the Natal depth camera (6), as well as the angle between the operator's thumb and forefinger;
Step 6: through a spatial coordinate transformation, the computer (7) maps the current spatial coordinates of the operator's hand obtained in step 5 to target spatial coordinates within the flight space of the quadrotor (1);
Step 7: from the current spatial coordinates of the quadrotor (1) obtained in step 3 and the target spatial coordinates obtained in step 6, the computer (7) calculates the PID control quantities for the spatial position of the quadrotor (1), including the forward-backward control quantity, the left-right translation control quantity and the up-down control quantity; from the thumb-forefinger angle obtained in step 5 it generates the control quantity for the pinch angle of the manipulator (12); from the yaw angle obtained in step 2 it calculates a yaw control quantity that suppresses spin of the quadrotor (1); all of these are transferred to the airborne microcontroller (8) of the quadrotor through the ZigBee wireless communication module (14);
Step 8: the airborne microcontroller (8) of the quadrotor transfers the received forward-backward, left-right translation, up-down and yaw control quantities to the attitude controller (9) in the form of PWM signals, and transfers the received manipulator control quantity to the manipulator (12) in the form of a PWM signal;
Step 9: the attitude controller (9) resolves the received control information into four motor drive signals for the corresponding action task; after power amplification by the electronic speed controllers (15) these finally drive the four brushless motors (10), so that the quadrotor (1) moves to the designated position and the manipulator pinches to the specified angle to complete the grasping action.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN 201110110471 CN102219051B (en) | 2011-04-29 | 2011-04-29 | Method for controlling four-rotor aircraft system based on human-computer interaction technology |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102219051A CN102219051A (en) | 2011-10-19 |
CN102219051B true CN102219051B (en) | 2013-01-09 |
Family
ID=44775817
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN 201110110471 Expired - Fee Related CN102219051B (en) | 2011-04-29 | 2011-04-29 | Method for controlling four-rotor aircraft system based on human-computer interaction technology |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102219051B (en) |
Families Citing this family (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102354345A (en) * | 2011-10-21 | 2012-02-15 | 北京理工大学 | Medical image browse device with somatosensory interaction mode |
CN103365294A (en) * | 2012-03-29 | 2013-10-23 | 鸿富锦精密工业(深圳)有限公司 | Unmanned aerial vehicle control system and method thereof |
CN103940823A (en) * | 2014-04-25 | 2014-07-23 | 河海大学常州校区 | Iron tower defect detection system based on aircraft and aircraft positioning method |
CN105321315A (en) * | 2014-07-30 | 2016-02-10 | 芯讯通无线科技(上海)有限公司 | Aircraft system and communication method thereof |
CN105334851A (en) * | 2014-08-12 | 2016-02-17 | 深圳市银星智能科技股份有限公司 | Mobile device capable of sensing gesture |
CN104581072B (en) * | 2015-01-05 | 2018-10-02 | 嘉兴欧祥通讯设备有限公司 | Night capture apparatus |
CN104590554B (en) * | 2015-01-05 | 2017-02-22 | 张晶 | Path-finding lighting equipment |
CN104565876B (en) * | 2015-01-05 | 2018-09-28 | 永康市华格工业产品设计有限公司 | Portable lighting device |
CN104606901A (en) * | 2015-01-29 | 2015-05-13 | 赵旭 | Height-air-pressure-detection infrared-reflection-induction type toy air vehicle structure and application of height-air-pressure-detection infrared-reflection-induction type toy air vehicle structure |
CN106143927A (en) * | 2015-04-03 | 2016-11-23 | 云南师范大学 | A kind of indoor four-axle aircraft closed-loop control system based on Kinect |
CN104816829B (en) * | 2015-05-12 | 2017-01-18 | 广东澄星航模科技股份有限公司 | Skyeye aircraft applicable to investigation |
CN104808799A (en) * | 2015-05-20 | 2015-07-29 | 成都通甲优博科技有限责任公司 | Unmanned aerial vehicle capable of indentifying gesture and identifying method thereof |
CN104898524B (en) * | 2015-06-12 | 2018-01-09 | 江苏数字鹰科技发展有限公司 | No-manned machine distant control system based on gesture |
CN104965516B (en) * | 2015-06-15 | 2017-10-10 | 东南大学 | A kind of view-based access control model and the quadrotor of force feedback carry out the control method of aerial crawl operation |
CN105607740A (en) * | 2015-12-29 | 2016-05-25 | 清华大学深圳研究生院 | Unmanned aerial vehicle control method and device based on computer vision |
CN105843251B (en) * | 2016-03-28 | 2020-04-07 | 沈阳上博智拓科技有限公司 | Method, device and system for controlling aircraft |
CN105843252A (en) * | 2016-03-30 | 2016-08-10 | 乐视控股(北京)有限公司 | Unmanned aircraft system and flight control method thereof |
EP3228541B1 (en) * | 2016-04-08 | 2018-06-13 | LEONARDO S.p.A. | Rotor for a hover-capable aircraft and method for detecting the attitude of a blade with respect to a hub of such a rotor |
CN108214509B (en) * | 2016-07-12 | 2020-12-15 | 浙江昌新生物纤维股份有限公司 | Robot |
CN106054911A (en) * | 2016-07-22 | 2016-10-26 | 华南理工大学 | Quadrocopter capable of gesture identification operation and method thereof |
CN106774365A (en) * | 2016-12-20 | 2017-05-31 | 华中科技大学 | A kind of unmanned plane debugging system based on external view positioning |
US10409276B2 (en) * | 2016-12-21 | 2019-09-10 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for controller-free user drone interaction |
CN107645599A (en) * | 2017-09-21 | 2018-01-30 | 宇龙计算机通信科技(深圳)有限公司 | A kind of control method, terminal and computer-readable recording medium |
CN107831791B (en) * | 2017-11-17 | 2020-12-15 | 深圳意动航空科技有限公司 | Unmanned aerial vehicle control method and device, control equipment and storage medium |
CN110825121B (en) * | 2018-08-08 | 2023-02-17 | 纬创资通股份有限公司 | Control device and unmanned aerial vehicle control method |
CN109176559B (en) * | 2018-10-24 | 2022-05-24 | 上海思依暄机器人科技股份有限公司 | Robot and control method and system |
CN112947589A (en) * | 2021-03-10 | 2021-06-11 | 南京理工大学 | Indoor four-rotor unmanned aerial vehicle based on dual-core DSP gesture control |
CN113282110B (en) * | 2021-07-23 | 2021-10-22 | 季华实验室 | Flying robot and human cooperative operation method and device and flying robot |
CN113282109A (en) * | 2021-07-23 | 2021-08-20 | 季华实验室 | Unmanned aerial vehicle and human cooperative operation system |
CN116337086B (en) * | 2023-05-29 | 2023-08-04 | 中国人民解放军海军工程大学 | Method, system, medium and terminal for calculating optimal capturing position of unmanned aerial vehicle network capturing |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4490794A (en) * | 1982-01-18 | 1984-12-25 | Sperry Corporation | Altitude preselect system for aircraft without air data computer |
FR2604001B1 (en) * | 1986-09-15 | 1988-12-09 | Aerospatiale | ELECTRIC FLIGHT CONTROL SYSTEM WITH INCIDENT PROTECTION FOR AIRCRAFT |
US6390417B1 (en) * | 1999-06-30 | 2002-05-21 | Honda Giken Kogyo Kabushiki Kaisha | Drag control system for flying machine, process for estimating drag of flying machine, boundary layer control system, and boundary layer control process |
CN101699510A (en) * | 2009-09-02 | 2010-04-28 | 北京科技大学 | Particle filtering-based pupil tracking method in sight tracking system |
CN102012741A (en) * | 2010-11-16 | 2011-04-13 | 吕文阁 | Camera device-based multimedia human-computer interaction method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
C17 | Cessation of patent right | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 2013-01-09; Termination date: 2014-04-29