CN111124126A - Unmanned aerial vehicle gesture control method - Google Patents

Unmanned aerial vehicle gesture control method

Info

Publication number
CN111124126A
CN111124126A (application CN201911359636.4A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
palm
bending
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911359636.4A
Other languages
Chinese (zh)
Inventor
崔剑
刘康祺
朱杰
张安迪
李泽波
李舒婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201911359636.4A priority Critical patent/CN111124126A/en
Publication of CN111124126A publication Critical patent/CN111124126A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a gesture control method for an unmanned aerial vehicle, belonging to the field of unmanned aerial vehicle control. First, an unmanned aerial vehicle gesture control device is built on a right-hand glove, and an unmanned aerial vehicle instruction execution module is installed on the unmanned aerial vehicle body; by setting a threshold, each bending sensor in the gesture control device is judged to be in a straight or bent state. Then, the tilt attitude of the palm is judged from the X-axis and Y-axis acceleration components of the acceleration sensor in the gesture control device; the bending information and the palm attitude information are combined and input into a PIC single chip microcomputer to obtain the gesture state corresponding to the combined information and the instruction corresponding to that gesture. Finally, the operator makes the corresponding palm and finger gesture actions as required, the PIC single chip microcomputer transmits the instruction signal to the Raspberry Pi through the Bluetooth communication module, and the Raspberry Pi uses the MAVLink protocol to call the library function for the corresponding action in the flight control module, so that the unmanned aerial vehicle completes the commanded action. The invention recognizes gestures more accurately and reduces cost.

Description

Unmanned aerial vehicle gesture control method
Technical Field
The invention belongs to the field of unmanned aerial vehicle control, and particularly relates to an unmanned aerial vehicle gesture control method.
Background
In recent years the country has paid more and more attention to the development of unmanned aerial vehicles, and interest in them keeps rising. Major unmanned aerial vehicle companies at home and abroad continue to attract outside investment and to innovate in endurance, control, safety and stability, so that unmanned aerial vehicles are becoming increasingly stable and continue to evolve.
With the development of artificial intelligence and the arrival of the 5G era, unmanned aerial vehicles are becoming more intelligent: Jingdong (JD.com) has set up unmanned aerial vehicle delivery stations, and Dajiang (DJI) and the geographic information company Leica have jointly developed an aerial photography system and a 5G unmanned aerial vehicle security system that can carry an AI deep-learning platform and perform rapid multi-scene, multi-person, multi-dimensional video analysis. Moreover, consumer unmanned aerial vehicles now occupy a larger market share than military ones and will be the main arena of future market competition.
At the same time, consumer unmanned aerial vehicles are expected to be simple and convenient to operate. However, innovations in unmanned aerial vehicle control are rare both at home and abroad, and operation still depends largely on a remote controller. A remote controller is complicated and inconvenient to operate and demands considerable expertise from the operator; for the general public this entails a learning cost and hinders the popularization of unmanned aerial vehicles.
Although gesture control has appeared in existing unmanned aerial vehicle control, most solutions recognize gestures by pattern recognition or image recognition. Image recognition requires a camera module of sufficient resolution and an image processing chip, so the cost is high; at long range the accuracy of image-based gesture recognition drops and misoperation is easily caused; and in some environments (dim light, haze, etc.) image recognition performs poorly. Using image recognition also means that the controlling hand must stay within the field of view of the camera module, which is a strong limitation. Pattern-recognition-based gestures likewise have many shortcomings in the field of unmanned aerial vehicle control and cannot fully meet its needs.
A few unmanned aerial vehicles use sensors for gesture control, but recognize gestures with only a single sensor. For example, the utility model application with publication number CN207718231U, published on 10 August 2018, discloses "a gesture recognition remote control unmanned aerial vehicle"; the gesture actions it can recognize are very limited, so few instructions are available for controlling the unmanned aerial vehicle, and flexible control of flight is hard to achieve.
Disclosure of Invention
To address these problems, the invention provides an unmanned aerial vehicle gesture control method in which several sensors cooperatively acquire the gesture signal and each gesture instruction signal corresponds one-to-one with a flight control signal used to control the unmanned aerial vehicle, so that many gesture actions can be recognized flexibly, flexible flight of the unmanned aerial vehicle is achieved, and control flexibility is improved.
The gesture control method of the unmanned aerial vehicle comprises the following specific steps:
Step one, construct an unmanned aerial vehicle gesture control device on a right-hand glove, and install an unmanned aerial vehicle instruction execution module on the unmanned aerial vehicle body;
The unmanned aerial vehicle gesture control device comprises bending sensors, fixed resistors, an acceleration sensor, an integrated operational amplifier, a PIC single chip microcomputer and a Bluetooth communication transmitting module.
The unmanned aerial vehicle instruction execution module comprises a flight control module, a Raspberry Pi, an FT4232H conversion module and a Bluetooth communication receiving module.
There are 4 bending sensors, attached on the glove to the backs of the four fingers other than the thumb, used for recognizing and collecting finger bending signals. Each bending sensor is connected in series with a fixed resistor to form a voltage division circuit, so that the change of the sensor resistance is converted into a change of the divided voltage; the voltage signal is stabilized by a voltage follower built from the integrated operational amplifier and then input to the PIC single chip microcomputer through an I/O port;
Meanwhile, there is 1 acceleration sensor, attached to the center of the back of the glove, which detects the motion acceleration of the hand and the components of gravitational acceleration on its three axes under different hand orientations, and thereby identifies the attitude of the hand. It is connected to the PIC single chip microcomputer through the I2C protocol, and the PIC single chip microcomputer is connected to the Bluetooth communication transmitting module through the UART protocol;
The Bluetooth communication receiving module is connected to the Raspberry Pi through the UART protocol; the Raspberry Pi converts the instruction signal into the control signal of the corresponding flight mode of the unmanned aerial vehicle and is connected to the flight control module through the MAVLink protocol, and the flight control module controls the unmanned aerial vehicle body to complete the various actions.
Further, the Raspberry Pi is connected to the FT4232H conversion module, acting as an intermediary, through a data cable.
Step two, judge whether each bending sensor in the unmanned aerial vehicle gesture control device is in a straight or bent state by setting a threshold.
The specific process is as follows:
First, a threshold ave is set in the PIC single chip microcomputer: ave = (a1 + a2)/2;
where a1 is the average of the variable over n readings with the bending sensor straight, and a2 is the average over n readings with the bending sensor bent.
Then, for a given bending angle of the sensor, if the corresponding data value exceeds the threshold ave, the bending sensor is considered straight; otherwise it is considered bent.
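The threshold comparison is a one-line decision; the sketch below renders it in Python for clarity (the actual logic runs in the PIC single chip microcomputer firmware, and the function name and example values are illustrative only).

```python
# Illustrative sketch of the straight/bent decision of step two.
# is_straight and the sample values are hypothetical; only the comparison
# against the calibrated threshold ave comes from the description above.

def is_straight(adc_value: int, ave: float) -> bool:
    """Return True when the flex-sensor reading exceeds the threshold ave,
    i.e. the finger is considered straight; otherwise it is considered bent."""
    return adc_value > ave

# Example with a calibrated threshold of ave = 512:
print(is_straight(700, 512))  # True  -> straight
print(is_straight(300, 512))  # False -> bent
```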
Step three, judge the tilt attitude of the palm from the X-axis and Y-axis acceleration components of the acceleration sensor in the unmanned aerial vehicle gesture control device;
The specific judgment process is as follows:
First, the data received from the acceleration sensor are filtered with a Kalman algorithm.
Then, when the hand is stationary, the three acceleration values are the components of gravitational acceleration on the three coordinate axes; when the attitude of the acceleration sensor changes, the three components change as well. When the X-axis acceleration component is greater than 0.5 g, the palm is judged to be tilted left; when the X-axis component is less than -0.5 g, the palm is judged to be tilted right; when the Y-axis component is greater than 0.5 g, the palm is judged to be tilted up; when the Y-axis component is less than -0.5 g, the palm is judged to be tilted down; if none of these conditions occurs, the palm is judged to be level.
Step four, combine the bending information from the bending sensors and the palm attitude information from the acceleration sensor in the PIC single chip microcomputer to obtain the gesture state corresponding to the combined information and the instruction corresponding to that gesture;
The gesture actions and the corresponding instructions are as follows (a lookup-table sketch is given after the list):
1) palm tilted left, four fingers bent: the unmanned aerial vehicle yaws left;
2) palm tilted left, four fingers not bent: the unmanned aerial vehicle flies left;
3) palm tilted right, four fingers bent: the unmanned aerial vehicle yaws right;
4) palm tilted right, four fingers not bent: the unmanned aerial vehicle flies right;
5) palm tilted up, one finger straight: the unmanned aerial vehicle returns home;
6) palm tilted up, not one finger straight: the unmanned aerial vehicle ascends;
7) palm tilted down, four fingers bent: the unmanned aerial vehicle is forcibly powered off;
8) palm tilted down, one finger straight: the unmanned aerial vehicle lands;
9) palm tilted down, not one finger straight: the unmanned aerial vehicle descends;
10) palm level, four fingers straight: the unmanned aerial vehicle hovers;
11) palm level, four fingers bent: the unmanned aerial vehicle moves backward;
12) palm level, one finger straight: the unmanned aerial vehicle moves forward;
Note: "one finger straight" means that any single one of the four fingers is straightened; "not one finger straight" covers the finger states other than that.
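The sketch below renders the table above as a lookup from (palm attitude, finger state) to an instruction code. The state labels and command names are illustrative assumptions; the real mapping lives in the PIC firmware and in fig. 8.

```python
# Hedged sketch of the gesture-to-instruction mapping listed above.

GESTURE_TABLE = {
    ("left",  "four_bent"):     "YAW_LEFT",
    ("left",  "not_four_bent"): "FLY_LEFT",
    ("right", "four_bent"):     "YAW_RIGHT",
    ("right", "not_four_bent"): "FLY_RIGHT",
    ("up",    "one_straight"):  "RETURN_HOME",
    ("up",    "other"):         "ASCEND",
    ("down",  "four_bent"):     "FORCE_POWER_OFF",
    ("down",  "one_straight"):  "LAND",
    ("down",  "other"):         "DESCEND",
    ("level", "four_straight"): "HOVER",
    ("level", "four_bent"):     "BACKWARD",
    ("level", "one_straight"):  "FORWARD",
}

def gesture_to_command(palm: str, fingers: str) -> str:
    # Unrecognised combinations fall back to hovering (an assumption,
    # not stated in the patent).
    return GESTURE_TABLE.get((palm, fingers), "HOVER")

print(gesture_to_command("down", "one_straight"))  # LAND
```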
Step five, the operator makes the corresponding palm and finger gesture actions as required, and the PIC single chip microcomputer transmits the instruction signal to the Raspberry Pi through the Bluetooth communication module;
Step six, the Raspberry Pi uses the MAVLink protocol to call the library function for the corresponding action in the flight control module, thereby controlling the unmanned aerial vehicle to complete the commanded action.
Compared with the prior art, the invention has the following advantages:
(1) in the unmanned aerial vehicle gesture control method, the unmanned aerial vehicle is controlled through a glove; the operation is simple and convenient, which lowers the difficulty of piloting the unmanned aerial vehicle;
(2) in the unmanned aerial vehicle gesture control method, the gesture signals are collected by sensors; the structure is simple and reliable, environmental adaptability is improved, gesture recognition is more accurate, and the cost is reduced at the same time;
(3) in the unmanned aerial vehicle gesture control method, multiple sensors are combined, so more complex gestures can be recognized, the number of gesture instructions available for operating the unmanned aerial vehicle is increased, and the operability and flexibility of the unmanned aerial vehicle are improved;
(4) in the unmanned aerial vehicle gesture control method, the Raspberry Pi is used as a data relay in the information transmission, so the system remains effective even after the flight controller is replaced, which guarantees the portability of the control system.
Drawings
FIG. 1 is a flow chart of a method for controlling gestures of an unmanned aerial vehicle according to the present invention;
FIG. 2 is a schematic diagram of a control device based on the gesture control method of the unmanned aerial vehicle of the present invention;
FIG. 3 is a diagram showing a gesture control device installed on a right-hand glove according to the present invention;
FIG. 4 is a schematic diagram of the coordinate correspondence between the acceleration sensor and the glove of the present invention;
FIG. 5 is a diagram of the connections between the chips constructed on a right-handed glove of the present invention;
fig. 6 is a real object display diagram of each chip integrated on the unmanned aerial vehicle body according to the present invention;
fig. 7 is a schematic circuit connection diagram of chips integrated on the unmanned aerial vehicle body according to the present invention;
FIG. 8 is a schematic diagram of command design and corresponding command functions for gesture control according to the present invention;
fig. 9 is a schematic view of part of the code run by the Raspberry Pi on the unmanned aerial vehicle body according to the present invention;
FIG. 10 is a schematic illustration of a takeoff preparation for an unmanned aerial vehicle of the present invention;
FIGS. 11-13 are diagrams showing the left and right rolling, ascending and descending flight effects of the UAV of the present invention;
fig. 14 is a schematic view of flight parameters fed back in the test flight process of the unmanned aerial vehicle.
Detailed Description
The present invention will be described in further detail below with reference to the accompanying drawings, so that those skilled in the art can understand and practice the invention.
The invention relates to an unmanned aerial vehicle gesture control method. Gesture signals are collected by the two kinds of sensors; the single chip microcomputer reads the sensor signals and determines the gesture state. The flight instruction corresponding to each valid gesture is stored in the program of the single chip microcomputer; after the current gesture state is determined, the corresponding instruction signal is transmitted over serial communication to the Bluetooth module at the glove end, which sends the signal out wirelessly.
At the unmanned aerial vehicle end, the paired Bluetooth module first receives the instruction signal. The Raspberry Pi is connected to the Bluetooth module through serial communication, reads the instruction signal received by the Bluetooth module, and determines which flight action of the unmanned aerial vehicle the instruction corresponds to. The Raspberry Pi is connected to the flight controller through the MAVLink protocol and controls it by calling the library function for the corresponding flight action, so that the unmanned aerial vehicle executes that flight action.
As shown in fig. 1, the specific steps are as follows:
Step one, construct the unmanned aerial vehicle gesture control device on a right-hand glove, and install the unmanned aerial vehicle instruction execution module on the unmanned aerial vehicle body;
As shown in fig. 2, the unmanned aerial vehicle gesture control device comprises bending sensors, fixed resistors, an acceleration sensor, an integrated operational amplifier, a PIC single chip microcomputer and a Bluetooth communication transmitting module.
As shown in fig. 6, the unmanned aerial vehicle instruction execution module comprises a flight control module, a Raspberry Pi, an FT4232H conversion module and a Bluetooth communication receiving module.
As shown in fig. 3, the bending sensors are Flex 2.2" bending sensors, 4 sets in total, attached on the glove to the backs of the four fingers other than the thumb and used for recognizing and collecting finger bending signals. They are attached to the finger portions of the glove in sewn cloth sleeves, which makes them easy to insert and remove and lets each sensor slide longitudinally during bending to relieve longitudinal stress, so data acquisition is more stable and reliable;
Each set of sensors uses the same circuit connection: the 5 V supply pin Vcc of the PIC single chip microcomputer is wired to one pin of the bending sensor; the other pin of the bending sensor is connected in series with a resistor of fixed value and then to a GND pin of the PIC single chip microcomputer, forming a voltage division circuit. The divided voltage is taken from the node between the bending sensor and the fixed resistor and fed to the non-inverting input of the integrated operational amplifier, whose output pin is tied to its inverting input (a voltage follower) and then connected to an I/O port of the PIC single chip microcomputer.
Meanwhile, there is 1 acceleration sensor, an MPU6050 acceleration sensor (i.e. an MPU6050 inertial sensor), used to detect the motion acceleration of the hand and the components of gravitational acceleration on the sensor's x, y and z axes for different hand orientations, and thereby identify the attitude of the hand; it is connected to the PIC single chip microcomputer through the I2C protocol;
The specific mounting is as follows: the sensor is fixed with hot-melt adhesive to a plastic plate, and the plastic plate is then glued with hot-melt adhesive to the back of the glove, so that the inertial sensor closely follows the motion of the hand and the measured data are more accurate.
The bending sensors and the acceleration sensor cooperatively acquire the various hand gestures and send them to the PIC single chip microcomputer, which is connected to the Bluetooth communication transmitting module through the UART protocol and sends out the gesture instructions;
The Bluetooth communication module is an HC-42 Bluetooth serial transparent-transmission module (BT module);
The PIC single chip microcomputer is a chipKIT Max32 board; it is programmed and used through the Arduino IDE after the corresponding board support package is installed.
A connection diagram of the PIC single chip microcomputer, the acceleration sensor and the Bluetooth communication transmitting module is shown in fig. 5.
The Bluetooth communication receiving module on the unmanned aerial vehicle body, paired with the Bluetooth communication transmitting module, receives the instruction signal sent by the PIC single chip microcomputer, passes it over the UART protocol to the FT4232H conversion module, and the module forwards it over a data cable to the connected Raspberry Pi.
The FT4232H conversion module is an FT4232H Mini Module;
The connection between the Bluetooth communication receiving module and the FT4232H conversion module is shown in fig. 7;
The Raspberry Pi is connected to the flight control module through the MAVLink protocol; it converts the instruction signal into the control signal of the corresponding flight mode of the unmanned aerial vehicle and sends it to the flight control module, which controls the unmanned aerial vehicle body to complete the corresponding commanded action.
The flight control module is a Pixhawk flight controller, which is effectively the brain of the unmanned aerial vehicle. In the invention the flight controller works in an automatic flight mode; the communication function of the Telem2 port is enabled through the ground station software Mission Planner, the baud rate is set to 921600, and the flight controller log parameters are configured so that instructions can be received from the Raspberry Pi and executed. The Pixhawk flight controller has several UART interfaces, of which Telem2 is the uartD telemetry interface and can be used for data transmission over the MAVLink protocol.
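A minimal sketch, assuming the pymavlink library, of opening the MAVLink link from the Raspberry Pi to the Telem2 port at the baud rate configured above; the serial device path is an assumption and depends on how the Pi's UART is wired.

```python
from pymavlink import mavutil

# Telem2 on the Pixhawk is configured for 921600 baud (see above).
master = mavutil.mavlink_connection("/dev/serial0", baud=921600)

# Wait for the first heartbeat so target_system / target_component are known.
master.wait_heartbeat()
print("Connected to system %d, component %d"
      % (master.target_system, master.target_component))
```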
The Raspberry Pi needs two sets of UART buses, but the Raspberry Pi 3 Model B, despite its 40 pins, exposes only one set of UART bus pins, so the FT4232H conversion module is needed to connect the Bluetooth module; this module breaks one USB port of the Raspberry Pi out into 4 pairs of UART bus pins, as shown in fig. 7. The Vcc, GND, TX and RX pins of the Raspberry Pi are connected respectively to the Vcc, GND, RX and TX pins of the flight controller's Telem2 interface.
The FT4232H conversion module is connected to the Raspberry Pi, and powered, through a USB-A to micro-USB data cable; the internal wiring of the FT4232H conversion module must be connected so that the whole FT4232H module is powered normally.
Step two, judge whether each bending sensor in the unmanned aerial vehicle gesture control device is in a straight or bent state by setting a threshold.
A change in the bending degree of the bending sensor is converted into a change of resistance: as it bends, the resistance of a Flex 2.2" bending sensor varies from about 7000 Ω to about 13000 Ω (the bending degree and the resistance are roughly, but not strictly, positively correlated, with a fairly large error; in this device a threshold determined by experiment is therefore used to binarize the bending-sensor signal). The bending sensor is connected in series with a fixed 10 kΩ resistor to form a voltage division circuit; the divided voltage, stabilized by the integrated operational amplifier, is fed to an analog input of the PIC single chip microcomputer, and the magnitude of the voltage reflects the bending degree of the bending sensor (i.e. of the finger).
Meanwhile, in view of the input impedance of the PIC single chip microcomputer's input pin, an integrated operational amplifier (a TL084CN) is added as a voltage follower to ensure that the data acquired by the PIC single chip microcomputer are stable, accurate and reliable. The voltage is read into the PIC single chip microcomputer as a decimal value in the range 0-1023; the bending sensor is fixed at a given angle and the data value is read out to complete the calibration of the bending sensor.
In the invention a threshold is set for this data value by experiment; exceeding the threshold means the bending sensor is considered straight, and vice versa, which yields the posture datum of whether the finger is straight.
The threshold is stored in the single chip microcomputer, and the thresholds corresponding to different single chip microcomputers differ. The threshold is set experimentally as follows: a variable A in the single chip microcomputer stores a processed value corresponding to the bending degree of the bending sensor; the values of A for n straight postures are recorded and averaged to give a1, the values of A for n bent postures are recorded and averaged to give a2, and the threshold is then set to ave = (a1 + a2)/2.
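The calibration procedure reduces to two averages and a midpoint; the sketch below renders it in Python (on the real device this runs in the PIC firmware, and the sample values are made up).

```python
# Hedged Python rendering of the calibration routine: record the processed
# flex value A for n straight postures and n bent postures, then set the
# threshold to the midpoint ave = (a1 + a2) / 2.

def calibrate_threshold(straight_samples, bent_samples):
    """straight_samples / bent_samples: lists of n readings of variable A."""
    a1 = sum(straight_samples) / len(straight_samples)  # mean while straight
    a2 = sum(bent_samples) / len(bent_samples)          # mean while bent
    return (a1 + a2) / 2

# Example with made-up 10-bit ADC readings:
ave = calibrate_threshold([690, 702, 695], [310, 298, 305])
print(ave)  # 500.0 -> readings above this are treated as "straight"
```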
Step three, judge the tilt attitude of the palm from the X-axis and Y-axis acceleration components of the acceleration sensor in the unmanned aerial vehicle gesture control device;
The acceleration sensor is a spatial motion sensor chip that can acquire its three current acceleration components, three rotational angular velocities and the current chip temperature (the angular velocity and temperature data are not used in the invention; the acceleration components are the components along the three axes of the chip's body-fixed coordinate system). The data collected by the acceleration sensor chip are stored in its registers.
The PIC single chip microcomputer is connected to the acceleration sensor through the I2C protocol. Each acceleration datum occupies 2 bytes, so the PIC single chip microcomputer has to read twice and then combine the bytes with bit operations to obtain the complete value.
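The "read twice, then combine with bit operations" step amounts to assembling a signed 16-bit value from a high byte and a low byte; a small sketch (function name and example bytes are illustrative):

```python
def combine_bytes(high: int, low: int) -> int:
    """Combine two 8-bit register reads into one signed 16-bit value."""
    value = (high << 8) | low
    return value - 0x10000 if value & 0x8000 else value

# 0x40 0x00 -> 16384, i.e. +1 g on that axis at the +/-2 g full-scale setting.
print(combine_bytes(0x40, 0x00))   # 16384
print(combine_bytes(0xC0, 0x00))   # -16384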
The data of the MPU6050 acceleration sensor are noisy, so they are filtered in the PIC single chip microcomputer with a Kalman algorithm. When the hand is stationary, the three acceleration values read out are the components of gravitational acceleration on the three coordinate axes; their raw values lie in (-16384, 16384), corresponding to acceleration components in (-g, g), so dividing each component by 16384 gives a value in (-1, 1), i.e. the component expressed as a multiple of the gravitational acceleration g. When the attitude of the acceleration sensor changes, the three components change as well; only the X-axis and Y-axis components are used in the invention. Combining the coordinate correspondence between the acceleration sensor and the glove shown in fig. 4, and allowing an error margin, the palm is judged to be tilted left when the X-axis component read is greater than 0.5 g and tilted right when it is less than -0.5 g; the palm is judged to be tilted up when the Y-axis component read is greater than 0.5 g and tilted down when it is less than -0.5 g; if none of these conditions occurs, the palm is judged to be level.
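The patent does not give its Kalman parameters, so the sketch below is a minimal scalar Kalman filter under a constant-state model with placeholder noise values, followed by the /16384 scaling described above; it only illustrates the shape of the computation, not the firmware implementation.

```python
# Minimal scalar Kalman filter sketch for smoothing one acceleration axis.
# Q and R are placeholders; the patent does not state them.

class ScalarKalman:
    def __init__(self, q=1e-3, r=2e-2):
        self.q, self.r = q, r       # process / measurement noise variances
        self.x, self.p = 0.0, 1.0   # state estimate and its variance

    def update(self, z: float) -> float:
        self.p += self.q                 # predict (constant-state model)
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # correct with measurement z
        self.p *= (1.0 - k)              # update variance
        return self.x

kf = ScalarKalman()
raw_x = [16890, 15990, 16420, 16310]     # noisy raw X-axis samples (made up)
for sample in raw_x:
    ax_g = kf.update(sample) / 16384.0   # scale raw counts to multiples of g
print(round(ax_g, 2))                    # ~1.0 -> roughly +1 g on the X axis
```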
Step four, combine the bending information from the bending sensors and the palm attitude information from the acceleration sensor in the PIC single chip microcomputer to obtain the gesture state corresponding to the combined information and the instruction corresponding to that gesture;
The instruction design for gesture control and the corresponding instruction functions are shown in fig. 8.
The hand gestures are arranged in one-to-one correspondence with the flight actions the unmanned aerial vehicle is expected to perform; for example, the unmanned aerial vehicle stays still when the palm is horizontal with the five fingers straightened, and returns home when the palm tilts up with the forefinger straightened.
The gesture actions and corresponding instructions are as follows:
1) the palm inclines leftwards, and the unmanned aerial vehicle realizes yawing leftwards under the state that the four fingers are bent;
2) the palm inclines leftwards, and the unmanned aerial vehicle flies leftwards under the state that the four fingers are not bent;
3) the palm inclines rightwards, and the unmanned aerial vehicle realizes right yawing in a four-finger bending state;
4) the palm is rightwards inclined, and the unmanned aerial vehicle flies rightwards under the non-four-finger bending state;
5) when the palm is inclined upwards and one finger is in a straight state, the unmanned aerial vehicle realizes return voyage;
6) the palm is inclined upwards, and the unmanned aerial vehicle ascends under the state that one finger is not straightened;
7) the palm is declined, and the unmanned aerial vehicle realizes forced power-off in a four-finger bending state;
8) the palm is declined, and the unmanned aerial vehicle lands when one finger is in a straight state;
9) the palm is declined, and the unmanned aerial vehicle descends under the state that one finger is not straightened;
10) the palm is horizontal, and the unmanned aerial vehicle can hover when the four fingers are in a straight state;
11) the palm is horizontal, and the unmanned aerial vehicle retreats under the state that four fingers are bent;
12) the palm is horizontal, and the unmanned aerial vehicle advances when one finger is in a straight state;
note: the first finger is straightened, that is, any one of the four fingers is straightened, and the non-first finger is straightened, that is, all the fingers except the first finger are straightened.
Step five, the operator makes the corresponding palm and finger gesture actions as required, and the PIC single chip microcomputer transmits the instruction signal to the Raspberry Pi through the Bluetooth communication module;
Step six, the Raspberry Pi uses the MAVLink protocol to call the library function for the corresponding action in the flight control module, thereby controlling the unmanned aerial vehicle to complete the commanded action.
The code run on the Raspberry Pi is shown in fig. 9; in this system the Raspberry Pi acts as a data relay from the Bluetooth communication module to the flight control module. The Raspberry Pi runs a Linux system; using Python it reads the instruction signal from the Bluetooth communication receiving module and sends the flight action instruction corresponding to that signal to the flight control module. The data channel by which the Raspberry Pi sends instructions to the flight control module is serial communication, and the MAVLink protocol can be used to call the specific flight routines of the flight controller directly, thereby controlling the unmanned aerial vehicle to complete the commanded actions.
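A hedged sketch of such a relay loop follows: read instruction bytes from the Bluetooth receiver over the FT4232H serial port and forward the matching flight action over MAVLink using pymavlink and pyserial. Port names and the single-byte command codes are assumptions; fig. 9 shows the actual code.

```python
import serial
from pymavlink import mavutil

bt = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)           # Bluetooth side
fc = mavutil.mavlink_connection("/dev/serial0", baud=921600)  # Pixhawk Telem2
fc.wait_heartbeat()

def send_land():
    # Ask the flight controller to land using the standard MAV_CMD_NAV_LAND.
    fc.mav.command_long_send(fc.target_system, fc.target_component,
                             mavutil.mavlink.MAV_CMD_NAV_LAND,
                             0, 0, 0, 0, 0, 0, 0, 0)

HANDLERS = {b"L": send_land}   # one entry per gesture instruction code

while True:
    code = bt.read(1)          # read one instruction byte (or b"" on timeout)
    if code in HANDLERS:
        HANDLERS[code]()
```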
After the flight controller executes each commanded action, it sends the aircraft state parameters back to the Raspberry Pi so that execution of the instruction can be checked, as shown in fig. 14.
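Reading that feedback on the Raspberry Pi could look like the pymavlink-based sketch below (an assumption; fig. 14 shows the parameters actually logged during the test flights).

```python
from pymavlink import mavutil

fc = mavutil.mavlink_connection("/dev/serial0", baud=921600)
fc.wait_heartbeat()

# Relative altitude and vertical speed arrive in GLOBAL_POSITION_INT messages.
msg = fc.recv_match(type="GLOBAL_POSITION_INT", blocking=True, timeout=5)
if msg is not None:
    print("relative altitude: %.2f m" % (msg.relative_alt / 1000.0))
    print("vertical speed:    %.2f m/s" % (msg.vz / 100.0))
```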
Example:
After all preparation before take-off of the unmanned aerial vehicle is done, the code on the Raspberry Pi is run and the Bluetooth communication receiving module of the unmanned aerial vehicle instruction execution part starts to receive the messages sent by the glove:
Take-off and hovering are shown in fig. 10: the unmanned aerial vehicle receives the take-off instruction from the glove and takes off fairly steadily to a set height; after reaching the preset height and hovering automatically for 3 seconds, it receives the hover gesture instruction and hovers at its current height and position.
Rolling (moving) left and right is shown in fig. 11: for safety, the left/right movement speed of the unmanned aerial vehicle is designed to be 0.2 m/s; the unmanned aerial vehicle moves left or right after receiving the corresponding gesture instruction.
Ascending and descending are shown in fig. 12: when the unmanned aerial vehicle receives the ascend gesture instruction it climbs continuously, and on reaching the preset height it hovers and waits for the next command. When it receives the descend instruction, it descends slowly by a certain height.
Landing is shown in fig. 13: after the experiment is completed and the unmanned aerial vehicle receives the landing gesture signal, its altitude keeps decreasing; when it is less than 0.2 m above the ground, the power-off signal is given and the propellers stop. The unmanned aerial vehicle lands safely on the grass.
Throughout the powered experiment the unmanned aerial vehicle received the messages corresponding to the gesture movements and changed its motion state accordingly, meeting the control requirements, and no safety or communication fault occurred during flight. The test was successful: motion control of the unmanned aerial vehicle no longer requires the remote controller at all, which is friendly to beginners, who need not fear the complicated operation of an unmanned aerial vehicle.
The invention adopts the technical means of having multiple sensors jointly recognize hand gestures and thereby control the unmanned aerial vehicle: several sensors added to the gesture acquisition device acquire the gesture instruction signal, the instruction signal is sent to the unmanned aerial vehicle instruction execution part, and the flight controller makes the unmanned aerial vehicle execute the instruction corresponding to the gesture, so that the unmanned aerial vehicle is controlled by gestures while in motion. The method has the advantages of simple operation, accurate gesture recognition, diverse gesture actions and low cost, and makes controlling an unmanned aerial vehicle simpler.

Claims (4)

1. An unmanned aerial vehicle gesture control method, characterized by comprising the following specific steps:
step one, constructing an unmanned aerial vehicle gesture control device on a right-hand glove, and installing an unmanned aerial vehicle instruction execution module on the unmanned aerial vehicle body;
the unmanned aerial vehicle gesture control device comprises bending sensors, fixed resistors, an acceleration sensor, an integrated operational amplifier, a PIC single chip microcomputer and a Bluetooth communication transmitting module;
the unmanned aerial vehicle instruction execution module comprises a flight control module, a Raspberry Pi, an FT4232H conversion module and a Bluetooth communication receiving module;
there are 4 bending sensors, attached on the glove to the backs of the four fingers other than the thumb and used for recognizing and collecting finger bending signals; each bending sensor is connected in series with a fixed resistor to form a voltage division circuit, so that the change of the sensor resistance is converted into a change of the divided voltage; the voltage signal is stabilized by a voltage follower built from the integrated operational amplifier and then input to the PIC single chip microcomputer through an I/O port;
meanwhile, there is 1 acceleration sensor, attached to the center of the back of the glove, which detects the motion acceleration of the hand and the components of gravitational acceleration on its three axes under different hand orientations so as to identify the attitude of the hand; it is connected to the PIC single chip microcomputer through the I2C protocol, and the PIC single chip microcomputer is connected to the Bluetooth communication transmitting module through the UART protocol;
the Bluetooth communication receiving module is connected to the Raspberry Pi through the UART protocol; the Raspberry Pi converts the instruction signal into the control signal of the corresponding flight mode of the unmanned aerial vehicle and is connected to the flight control module through the MAVLink protocol, and the flight control module controls the unmanned aerial vehicle body to complete the various actions;
step two, judging whether each bending sensor in the unmanned aerial vehicle gesture control device is in a straight or bent state by setting a threshold;
step three, judging the tilt attitude of the palm from the X-axis and Y-axis acceleration components of the acceleration sensor in the unmanned aerial vehicle gesture control device;
the specific judgment process is as follows:
first, filtering the data received from the acceleration sensor with a Kalman algorithm;
then, when the hand is stationary, the three acceleration values are the components of gravitational acceleration on the three coordinate axes; when the attitude of the acceleration sensor changes, the three components change as well; when the X-axis acceleration component is greater than 0.5 g, the palm is judged to be tilted left; when the X-axis component is less than -0.5 g, the palm is judged to be tilted right; when the Y-axis component is greater than 0.5 g, the palm is judged to be tilted up; when the Y-axis component is less than -0.5 g, the palm is judged to be tilted down; if none of these conditions occurs, the palm is judged to be level;
step four, combining the bending information from the bending sensors and the palm attitude information from the acceleration sensor in the PIC single chip microcomputer to obtain the gesture state corresponding to the combined information and the instruction corresponding to that gesture;
step five, the operator making the corresponding palm and finger gesture actions as required, and the PIC single chip microcomputer transmitting the instruction signal to the Raspberry Pi through the Bluetooth communication module;
and step six, the Raspberry Pi using the MAVLink protocol to call the library function for the corresponding action in the flight control module, thereby controlling the unmanned aerial vehicle to complete the commanded action.
2. The unmanned aerial vehicle gesture control method according to claim 1, wherein the Raspberry Pi is connected to the FT4232H conversion module, acting as an intermediary, through a data cable.
3. The unmanned aerial vehicle gesture control method according to claim 1, wherein the specific process of step two is as follows:
first, a threshold ave is set in the PIC single chip microcomputer: ave = (a1 + a2)/2;
where a1 is the average of the variable over n readings with the bending sensor straight, and a2 is the average over n readings with the bending sensor bent;
then, for a given bending angle of the sensor, if the corresponding data value exceeds the threshold ave, the bending sensor is considered straight; otherwise it is considered bent.
4. The unmanned aerial vehicle gesture control method according to claim 1, wherein the gesture actions and corresponding instructions in step four are as follows:
1) palm tilted left, four fingers bent: the unmanned aerial vehicle yaws left;
2) palm tilted left, four fingers not bent: the unmanned aerial vehicle flies left;
3) palm tilted right, four fingers bent: the unmanned aerial vehicle yaws right;
4) palm tilted right, four fingers not bent: the unmanned aerial vehicle flies right;
5) palm tilted up, one finger straight: the unmanned aerial vehicle returns home;
6) palm tilted up, not one finger straight: the unmanned aerial vehicle ascends;
7) palm tilted down, four fingers bent: the unmanned aerial vehicle is forcibly powered off;
8) palm tilted down, one finger straight: the unmanned aerial vehicle lands;
9) palm tilted down, not one finger straight: the unmanned aerial vehicle descends;
10) palm level, four fingers straight: the unmanned aerial vehicle hovers;
11) palm level, four fingers bent: the unmanned aerial vehicle moves backward;
12) palm level, one finger straight: the unmanned aerial vehicle moves forward;
Note: "one finger straight" means that any single one of the four fingers is straightened; "not one finger straight" covers the finger states other than that.
CN201911359636.4A 2019-12-25 2019-12-25 Unmanned aerial vehicle gesture control method Pending CN111124126A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911359636.4A CN111124126A (en) 2019-12-25 2019-12-25 Unmanned aerial vehicle gesture control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911359636.4A CN111124126A (en) 2019-12-25 2019-12-25 Unmanned aerial vehicle gesture control method

Publications (1)

Publication Number Publication Date
CN111124126A true CN111124126A (en) 2020-05-08

Family

ID=70502465

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911359636.4A Pending CN111124126A (en) 2019-12-25 2019-12-25 Unmanned aerial vehicle gesture control method

Country Status (1)

Country Link
CN (1) CN111124126A (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203759869U (en) * 2014-03-20 2014-08-06 西南科技大学 Gesture sensing type aircraft remote controller
CN104875890A (en) * 2015-06-01 2015-09-02 中国人民解放军装甲兵工程学院 Four-rotor aircraft
CN108268132A (en) * 2017-12-26 2018-07-10 北京航空航天大学 A kind of gesture identification method and human-computer interaction device based on gloves acquisition

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHENG Hailin et al., "Design of a data glove based on bending and combined MEMS sensors" (基于弯曲与组合MEMS传感器的数据手套设计), Guangdong Science and Technology (广东科技) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111913580A (en) * 2020-08-12 2020-11-10 南京工业职业技术学院 Gesture unmanned aerial vehicle controller based on infrared photoelectricity
CN113064442A (en) * 2021-03-05 2021-07-02 江苏师范大学 Gesture remote control unmanned aerial vehicle based on ROS
CN112971773A (en) * 2021-03-12 2021-06-18 哈尔滨工业大学 Hand motion mode recognition system based on palm bending information
CN113220126A (en) * 2021-05-21 2021-08-06 南京大学 Three-dimensional input interaction device of auto-stereoscopic display based on Mallink protocol
CN113220126B (en) * 2021-05-21 2023-06-09 南京大学 Three-dimensional input interaction device of free three-dimensional display based on Mavlink protocol
CN114115537A (en) * 2021-11-22 2022-03-01 中国电子科技集团公司第五十四研究所 Gesture control method and system for unmanned system
CN117031924A (en) * 2023-08-01 2023-11-10 之江实验室 Multi-rotor unmanned aerial vehicle PID control parameter gesture adjusting system
CN117806466A (en) * 2024-03-02 2024-04-02 良基(厦门)自动化设备有限公司 Control mode of gesture control full-automatic dish washer
CN117806466B (en) * 2024-03-02 2024-05-28 良基(厦门)自动化设备有限公司 Control mode of gesture control full-automatic dish washer

Similar Documents

Publication Publication Date Title
CN111124126A (en) Unmanned aerial vehicle gesture control method
CN110347266B (en) Space gesture control device based on machine vision
CN206170098U (en) Automation of target can be indoorly followed and thing robot is got
CN112154402A (en) Wearable device, control method thereof, gesture recognition method and control system
CN112203809B (en) Information processing apparatus and method, robot control apparatus and method, and storage medium
CN103294226B (en) A kind of virtual input device and method
CN210109640U (en) Unmanned aerial vehicle gesture control device
CN206877150U (en) A kind of unmanned aerial vehicle control system and unmanned plane
CN111290574B (en) Method and device for controlling unmanned aerial vehicle by using gestures and readable storage medium
Garberoglio et al. Choriboard III: a small and powerful flight controller for autonomous vehicles
CN206795850U (en) A kind of novel intelligent controls anthropomorphic robot
CN211698898U (en) Human-computer interaction system based on finger gesture is resolved
CN110624217A (en) Rehabilitation glove based on multi-sensor fusion and implementation method thereof
CN114815689A (en) Unmanned aerial vehicle for realizing gesture control and control system and control method thereof
CN109358550A (en) A kind of detection device and detection method of communication apparatus
CN107219928A (en) The equipment data acquisition analyzing and method of a kind of opportunity of combat pilot operator identification
CN103336529B (en) Model flight autostabilizer wireless setting regulates the method and apparatus of parameter
CN212623993U (en) Intelligent interactive pen and virtual reality system
CN112426709B (en) Forearm movement posture recognition method, interface interaction control method and device
CN205721358U (en) Robot and control system thereof
CN209765441U (en) Multi-mode dynamic gesture recognition device
CN106406540A (en) Posture sensing device and virtual reality system
CN210402104U (en) Gesture-controlled four-axis aircraft
CN110673624A (en) Aircraft control system and control method thereof
CN220313362U (en) Under-driven bionic smart hand

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200508